Sep 30 17:17:38 crc systemd[1]: Starting Kubernetes Kubelet...
Sep 30 17:17:38 crc restorecon[4679]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Sep 30 17:17:38 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 17:17:39 crc restorecon[4679]: 
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 
17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 17:17:39 crc 
restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:17:39 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 
17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:17:40 crc restorecon[4679]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 17:17:40 crc restorecon[4679]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Sep 30 17:17:41 crc kubenswrapper[4778]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 17:17:41 crc kubenswrapper[4778]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Sep 30 17:17:41 crc kubenswrapper[4778]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 17:17:41 crc kubenswrapper[4778]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 30 17:17:41 crc kubenswrapper[4778]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 30 17:17:41 crc kubenswrapper[4778]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.418504 4778 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.423880 4778 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.423913 4778 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.423922 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.423931 4778 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.423940 4778 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.423950 4778 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.423966 4778 feature_gate.go:330] unrecognized feature gate: Example Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.423975 4778 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.423983 4778 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.423991 4778 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424000 4778 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424008 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424016 4778 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424023 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424031 4778 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424039 4778 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424047 4778 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424055 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424062 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424070 4778 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424078 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS 
Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424086 4778 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424094 4778 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424102 4778 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424110 4778 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424118 4778 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424125 4778 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424133 4778 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424144 4778 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424155 4778 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424165 4778 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424174 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424182 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424190 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424198 4778 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424207 4778 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424215 4778 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424223 4778 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424232 4778 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424242 4778 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
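[Editor's note] Three distinct reactions are visible in this block: names the upstream kubelet does not know (OpenShift-specific gates such as ChunkSizeMiB) warn "unrecognized feature gate"; known GA gates still set explicitly (CloudDualStackNodeIPs=true) warn that the setting will be removed; known deprecated gates (KMSv1=true) warn likewise. A toy Go registry reproducing those three outcomes is sketched below; the gate table and classifications in it are assumptions for the example, not kubelet's real feature_gate.go table.

package main

import "fmt"

type maturity int

const (
	alpha maturity = iota
	beta
	ga
	deprecated
)

// known maps gate names to their maturity; entries here are illustrative.
var known = map[string]maturity{
	"CloudDualStackNodeIPs": ga,
	"KMSv1":                 deprecated,
}

// set mimics the three warning paths seen in the log above.
func set(name string, value bool) {
	m, ok := known[name]
	if !ok {
		fmt.Printf("W] unrecognized feature gate: %s\n", name)
		return
	}
	switch m {
	case ga:
		fmt.Printf("W] Setting GA feature gate %s=%v. It will be removed in a future release.\n", name, value)
	case deprecated:
		fmt.Printf("W] Setting deprecated feature gate %s=%v. It will be removed in a future release.\n", name, value)
	}
}

func main() {
	set("VSphereDriverConfiguration", true) // OpenShift-only name, unrecognized upstream
	set("CloudDualStackNodeIPs", true)
	set("KMSv1", true)
}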
Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424252 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424260 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424268 4778 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424277 4778 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424285 4778 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424293 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424300 4778 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424309 4778 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424316 4778 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424324 4778 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424332 4778 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424340 4778 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424349 4778 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424357 4778 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424365 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424373 4778 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424381 4778 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424390 4778 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424397 4778 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424405 4778 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424415 4778 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424423 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424434 4778 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
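[Editor's note] Below, the gate warnings give way to a dump of every kubelet command-line flag from flags.go:64, each in the form FLAG: --name="value". A small Go sketch for folding that dump into a map (handy for diffing the effective command line between restarts) follows; it is illustrative post-processing, and values are kept as strings since their types vary (durations, lists, image references).

package main

import (
	"fmt"
	"regexp"
)

// flagRe matches one 'FLAG: --name="value"' occurrence from the startup dump.
var flagRe = regexp.MustCompile(`FLAG: --([A-Za-z0-9-]+)="(.*?)"`)

// collectFlags extracts all flag name/value pairs from a blob of log text.
func collectFlags(log string) map[string]string {
	flags := map[string]string{}
	for _, m := range flagRe.FindAllStringSubmatch(log, -1) {
		flags[m[1]] = m[2]
	}
	return flags
}

func main() {
	log := `I0930 17:17:41.427684 4778 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
I0930 17:17:41.428521 4778 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"`
	for k, v := range collectFlags(log) {
		fmt.Printf("%s = %q\n", k, v)
	}
}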
Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424443 4778 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424451 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424459 4778 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424468 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424478 4778 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424491 4778 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424505 4778 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.424514 4778 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427490 4778 flags.go:64] FLAG: --address="0.0.0.0" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427518 4778 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427535 4778 flags.go:64] FLAG: --anonymous-auth="true" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427549 4778 flags.go:64] FLAG: --application-metrics-count-limit="100" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427561 4778 flags.go:64] FLAG: --authentication-token-webhook="false" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427571 4778 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427585 4778 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427599 4778 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427609 4778 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427645 4778 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427655 4778 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427664 4778 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427674 4778 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427684 4778 flags.go:64] FLAG: --cgroup-root="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427693 4778 flags.go:64] FLAG: --cgroups-per-qos="true" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427702 4778 flags.go:64] FLAG: --client-ca-file="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427710 4778 flags.go:64] FLAG: --cloud-config="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427719 4778 flags.go:64] FLAG: --cloud-provider="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427728 4778 flags.go:64] FLAG: --cluster-dns="[]" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427740 4778 flags.go:64] FLAG: --cluster-domain="" Sep 30 17:17:41 
crc kubenswrapper[4778]: I0930 17:17:41.427749 4778 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427758 4778 flags.go:64] FLAG: --config-dir="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427767 4778 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427778 4778 flags.go:64] FLAG: --container-log-max-files="5" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427790 4778 flags.go:64] FLAG: --container-log-max-size="10Mi" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427800 4778 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427809 4778 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427820 4778 flags.go:64] FLAG: --containerd-namespace="k8s.io" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427830 4778 flags.go:64] FLAG: --contention-profiling="false" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427839 4778 flags.go:64] FLAG: --cpu-cfs-quota="true" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427850 4778 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427859 4778 flags.go:64] FLAG: --cpu-manager-policy="none" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427868 4778 flags.go:64] FLAG: --cpu-manager-policy-options="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427880 4778 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427890 4778 flags.go:64] FLAG: --enable-controller-attach-detach="true" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427899 4778 flags.go:64] FLAG: --enable-debugging-handlers="true" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427908 4778 flags.go:64] FLAG: --enable-load-reader="false" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427917 4778 flags.go:64] FLAG: --enable-server="true" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427926 4778 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427939 4778 flags.go:64] FLAG: --event-burst="100" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427948 4778 flags.go:64] FLAG: --event-qps="50" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427957 4778 flags.go:64] FLAG: --event-storage-age-limit="default=0" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427967 4778 flags.go:64] FLAG: --event-storage-event-limit="default=0" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427977 4778 flags.go:64] FLAG: --eviction-hard="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427989 4778 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.427998 4778 flags.go:64] FLAG: --eviction-minimum-reclaim="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428007 4778 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428017 4778 flags.go:64] FLAG: --eviction-soft="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428026 4778 flags.go:64] FLAG: --eviction-soft-grace-period="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428035 4778 flags.go:64] FLAG: 
--exit-on-lock-contention="false" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428044 4778 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428054 4778 flags.go:64] FLAG: --experimental-mounter-path="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428063 4778 flags.go:64] FLAG: --fail-cgroupv1="false" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428071 4778 flags.go:64] FLAG: --fail-swap-on="true" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428081 4778 flags.go:64] FLAG: --feature-gates="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428093 4778 flags.go:64] FLAG: --file-check-frequency="20s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428102 4778 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428112 4778 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428122 4778 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428132 4778 flags.go:64] FLAG: --healthz-port="10248" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428141 4778 flags.go:64] FLAG: --help="false" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428151 4778 flags.go:64] FLAG: --hostname-override="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428162 4778 flags.go:64] FLAG: --housekeeping-interval="10s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428172 4778 flags.go:64] FLAG: --http-check-frequency="20s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428181 4778 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428190 4778 flags.go:64] FLAG: --image-credential-provider-config="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428199 4778 flags.go:64] FLAG: --image-gc-high-threshold="85" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428208 4778 flags.go:64] FLAG: --image-gc-low-threshold="80" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428217 4778 flags.go:64] FLAG: --image-service-endpoint="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428226 4778 flags.go:64] FLAG: --kernel-memcg-notification="false" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428235 4778 flags.go:64] FLAG: --kube-api-burst="100" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428245 4778 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428254 4778 flags.go:64] FLAG: --kube-api-qps="50" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428263 4778 flags.go:64] FLAG: --kube-reserved="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428272 4778 flags.go:64] FLAG: --kube-reserved-cgroup="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428281 4778 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428291 4778 flags.go:64] FLAG: --kubelet-cgroups="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428300 4778 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428309 4778 flags.go:64] FLAG: --lock-file="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428319 4778 flags.go:64] FLAG: --log-cadvisor-usage="false" Sep 
30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428329 4778 flags.go:64] FLAG: --log-flush-frequency="5s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428339 4778 flags.go:64] FLAG: --log-json-info-buffer-size="0" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428355 4778 flags.go:64] FLAG: --log-json-split-stream="false" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428364 4778 flags.go:64] FLAG: --log-text-info-buffer-size="0" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428373 4778 flags.go:64] FLAG: --log-text-split-stream="false" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428384 4778 flags.go:64] FLAG: --logging-format="text" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428394 4778 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428404 4778 flags.go:64] FLAG: --make-iptables-util-chains="true" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428413 4778 flags.go:64] FLAG: --manifest-url="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428423 4778 flags.go:64] FLAG: --manifest-url-header="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428435 4778 flags.go:64] FLAG: --max-housekeeping-interval="15s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428445 4778 flags.go:64] FLAG: --max-open-files="1000000" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428456 4778 flags.go:64] FLAG: --max-pods="110" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428466 4778 flags.go:64] FLAG: --maximum-dead-containers="-1" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428476 4778 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428485 4778 flags.go:64] FLAG: --memory-manager-policy="None" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428493 4778 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428503 4778 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428512 4778 flags.go:64] FLAG: --node-ip="192.168.126.11" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428521 4778 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428542 4778 flags.go:64] FLAG: --node-status-max-images="50" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428551 4778 flags.go:64] FLAG: --node-status-update-frequency="10s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428561 4778 flags.go:64] FLAG: --oom-score-adj="-999" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428570 4778 flags.go:64] FLAG: --pod-cidr="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428579 4778 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428593 4778 flags.go:64] FLAG: --pod-manifest-path="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428602 4778 flags.go:64] FLAG: --pod-max-pids="-1" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428611 4778 flags.go:64] FLAG: --pods-per-core="0" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428647 4778 flags.go:64] 
FLAG: --port="10250" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428658 4778 flags.go:64] FLAG: --protect-kernel-defaults="false" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428667 4778 flags.go:64] FLAG: --provider-id="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428676 4778 flags.go:64] FLAG: --qos-reserved="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428686 4778 flags.go:64] FLAG: --read-only-port="10255" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428695 4778 flags.go:64] FLAG: --register-node="true" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428704 4778 flags.go:64] FLAG: --register-schedulable="true" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428713 4778 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428729 4778 flags.go:64] FLAG: --registry-burst="10" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428739 4778 flags.go:64] FLAG: --registry-qps="5" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428747 4778 flags.go:64] FLAG: --reserved-cpus="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428756 4778 flags.go:64] FLAG: --reserved-memory="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428768 4778 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428777 4778 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428787 4778 flags.go:64] FLAG: --rotate-certificates="false" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428796 4778 flags.go:64] FLAG: --rotate-server-certificates="false" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428805 4778 flags.go:64] FLAG: --runonce="false" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428814 4778 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428824 4778 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428834 4778 flags.go:64] FLAG: --seccomp-default="false" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428843 4778 flags.go:64] FLAG: --serialize-image-pulls="true" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428852 4778 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428861 4778 flags.go:64] FLAG: --storage-driver-db="cadvisor" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428871 4778 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428881 4778 flags.go:64] FLAG: --storage-driver-password="root" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428890 4778 flags.go:64] FLAG: --storage-driver-secure="false" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428899 4778 flags.go:64] FLAG: --storage-driver-table="stats" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428908 4778 flags.go:64] FLAG: --storage-driver-user="root" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428917 4778 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428926 4778 flags.go:64] FLAG: --sync-frequency="1m0s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428936 4778 flags.go:64] FLAG: --system-cgroups="" Sep 30 17:17:41 
crc kubenswrapper[4778]: I0930 17:17:41.428945 4778 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428959 4778 flags.go:64] FLAG: --system-reserved-cgroup="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428973 4778 flags.go:64] FLAG: --tls-cert-file="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428982 4778 flags.go:64] FLAG: --tls-cipher-suites="[]" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.428994 4778 flags.go:64] FLAG: --tls-min-version="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.429003 4778 flags.go:64] FLAG: --tls-private-key-file="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.429012 4778 flags.go:64] FLAG: --topology-manager-policy="none" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.429021 4778 flags.go:64] FLAG: --topology-manager-policy-options="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.429030 4778 flags.go:64] FLAG: --topology-manager-scope="container" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.429040 4778 flags.go:64] FLAG: --v="2" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.429060 4778 flags.go:64] FLAG: --version="false" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.429072 4778 flags.go:64] FLAG: --vmodule="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.429088 4778 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.429098 4778 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429336 4778 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429347 4778 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429355 4778 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429363 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429371 4778 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429379 4778 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429387 4778 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429395 4778 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429403 4778 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429410 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429418 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429426 4778 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429434 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429445 4778 feature_gate.go:353] Setting GA feature gate 
ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429455 4778 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429464 4778 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429473 4778 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429481 4778 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429489 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429497 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429510 4778 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429518 4778 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429526 4778 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429534 4778 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429543 4778 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429552 4778 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429560 4778 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429569 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429577 4778 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429586 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429598 4778 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429606 4778 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429657 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429666 4778 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429675 4778 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429683 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429691 4778 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429699 4778 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429707 4778 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429715 4778 
feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429723 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429731 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429739 4778 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429747 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429757 4778 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429765 4778 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429772 4778 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429781 4778 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429788 4778 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429796 4778 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429804 4778 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429814 4778 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429831 4778 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429840 4778 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429848 4778 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429856 4778 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429864 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429872 4778 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429880 4778 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429888 4778 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429896 4778 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429903 4778 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429915 4778 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429923 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429933 4778 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429942 4778 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429952 4778 feature_gate.go:330] unrecognized feature gate: Example Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429961 4778 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429971 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429979 4778 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.429994 4778 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.430020 4778 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.447413 4778 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.447459 4778 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447584 4778 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447601 4778 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447638 4778 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447649 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447658 4778 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447667 4778 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447675 4778 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447686 4778 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
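[Editor's note] Each pass over the gate configuration ends with the I-level summary from feature_gate.go:386 just above, which prints the resolved set as Go's default formatting of a map, "feature gates: {map[Name:bool ...]}". A Go sketch decoding that printed form back into a map[string]bool follows; the parser shape is an assumption based only on the printed output.

package main

import (
	"fmt"
	"strings"
)

// parseGates pulls the Name:bool pairs out of a "feature gates: {map[...]}" line.
func parseGates(s string) map[string]bool {
	gates := map[string]bool{}
	start := strings.Index(s, "map[")
	end := strings.LastIndex(s, "]")
	if start < 0 || end < start {
		return gates
	}
	for _, pair := range strings.Fields(s[start+len("map["):end]) {
		if k, v, ok := strings.Cut(pair, ":"); ok {
			gates[k] = v == "true"
		}
	}
	return gates
}

func main() {
	line := "feature gates: {map[CloudDualStackNodeIPs:true KMSv1:true ValidatingAdmissionPolicy:true NodeSwap:false]}"
	g := parseGates(line)
	fmt.Println(len(g), g["KMSv1"], g["NodeSwap"]) // 4 true false
}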
Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447696 4778 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447706 4778 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447716 4778 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447725 4778 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447734 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447743 4778 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447751 4778 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447759 4778 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447768 4778 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447777 4778 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447786 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447794 4778 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447802 4778 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447811 4778 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447819 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447828 4778 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447836 4778 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447844 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447853 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447861 4778 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447869 4778 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447878 4778 feature_gate.go:330] unrecognized feature gate: Example Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447902 4778 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447913 4778 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447923 4778 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447934 4778 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447942 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447952 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447961 4778 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447969 4778 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447978 4778 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447986 4778 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.447995 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448003 4778 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448016 4778 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448025 4778 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448034 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448043 4778 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448051 4778 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448061 4778 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448088 4778 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448097 4778 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448105 4778 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448113 4778 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448121 4778 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448130 4778 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448138 4778 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448147 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448155 4778 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448164 4778 feature_gate.go:330] unrecognized feature gate: 
MetricsCollectionProfiles Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448172 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448181 4778 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448189 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448196 4778 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448205 4778 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448213 4778 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448225 4778 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448236 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448246 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448255 4778 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448263 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448272 4778 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448280 4778 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.448293 4778 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448539 4778 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448552 4778 feature_gate.go:330] unrecognized feature gate: SignatureStores Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448562 4778 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448572 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448584 4778 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448592 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448600 4778 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448608 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Sep 30 
17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448640 4778 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448649 4778 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448657 4778 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448666 4778 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448674 4778 feature_gate.go:330] unrecognized feature gate: PinnedImages Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448683 4778 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448691 4778 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448699 4778 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448707 4778 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448716 4778 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448724 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448733 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448741 4778 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448752 4778 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448761 4778 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448769 4778 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448777 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448785 4778 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448793 4778 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448801 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448809 4778 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448819 4778 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448828 4778 feature_gate.go:330] unrecognized feature gate: OVNObservability Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448836 4778 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448847 4778 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448855 4778 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448865 4778 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448875 4778 feature_gate.go:330] unrecognized feature gate: GatewayAPI Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448885 4778 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448895 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448904 4778 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448913 4778 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448922 4778 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448930 4778 feature_gate.go:330] unrecognized feature gate: Example Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448938 4778 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448946 4778 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448955 4778 feature_gate.go:330] unrecognized feature gate: PlatformOperators Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448962 4778 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448971 4778 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448978 4778 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448986 4778 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.448997 4778 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
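[Editor's note] The same unrecognized-gate set is replayed four times within this one second of boot (warning batches at offsets .424, .429, .447-.448, and .4485-.449), apparently once per pass the kubelet makes over the gate configuration, so roughly seventy unique names account for several hundred journal lines. A Go sketch that tallies unique gate names from journalctl output follows; it assumes one journal entry per line on stdin, which is journalctl's normal output rather than the wrapped form shown here.

package main

import (
	"bufio"
	"fmt"
	"os"
	"sort"
	"strings"
)

func main() {
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		const marker = "unrecognized feature gate: "
		if i := strings.Index(sc.Text(), marker); i >= 0 {
			gate := strings.TrimSpace(sc.Text()[i+len(marker):])
			counts[gate]++
		}
	}
	names := make([]string, 0, len(counts))
	for n := range counts {
		names = append(names, n)
	}
	sort.Strings(names)
	for _, n := range names {
		fmt.Printf("%3dx %s\n", counts[n], n)
	}
	fmt.Printf("%d unique gates\n", len(names))
}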
Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449006 4778 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449015 4778 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449023 4778 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449031 4778 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449039 4778 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449047 4778 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449055 4778 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449065 4778 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449073 4778 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449082 4778 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449090 4778 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449098 4778 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449109 4778 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449120 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfig Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449129 4778 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449139 4778 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449148 4778 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449156 4778 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449165 4778 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449174 4778 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.449182 4778 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.449194 4778 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.449431 4778 server.go:940] "Client rotation is on, will bootstrap in background" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.459124 4778 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.459272 4778 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
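[Editor's note] The lines above skip bootstrap because the existing kubeconfig is still valid, and the lines that follow schedule client-certificate rotation: expiration 2026-02-24 05:52:08 UTC, a rotation deadline of 2025-12-09 03:33:35 UTC, and a computed wait of about 1666h15m54s. The wait is plain subtraction between the logged deadline and "now"; the upstream certificate manager picks that deadline at a jittered 70-90% of the certificate's validity window, though treat that policy detail as background rather than something this log proves. A Go sketch reproducing the subtraction:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps taken from the rotation log lines below (UTC).
	now, _ := time.Parse(time.RFC3339Nano, "2025-09-30T17:17:41.463814Z")
	deadline, _ := time.Parse(time.RFC3339Nano, "2025-12-09T03:33:35.469491072Z")
	// Prints roughly 1666h15m54s, matching the logged
	// "Waiting 1666h15m54.005682457s for next certificate rotation".
	fmt.Println("wait:", deadline.Sub(now))
}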
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.462779 4778 server.go:997] "Starting client certificate rotation"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.462831 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.463705 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-09 03:33:35.469491072 +0000 UTC
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.463814 4778 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1666h15m54.005682457s for next certificate rotation
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.490016 4778 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.493006 4778 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.519495 4778 log.go:25] "Validated CRI v1 runtime API"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.566916 4778 log.go:25] "Validated CRI v1 image API"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.569722 4778 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.578886 4778 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-09-30-17-12-52-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.578977 4778 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.615324 4778 manager.go:217] Machine: {Timestamp:2025-09-30 17:17:41.609591351 +0000 UTC m=+0.599489224 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799886 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c BootID:11b51a32-4054-4a08-9c60-c43cf343227b Filesystems:[{Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:d4:b5:a9 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:d4:b5:a9 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:39:9f:0a Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:4e:29:3a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:24:b7:5b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:1c:5b:93 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ae:d2:42:30:57:51 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:f6:de:a7:9d:23:91 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.615807 4778 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.616095 4778 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.619074 4778 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.619530 4778 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.619679 4778 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.620098 4778 topology_manager.go:138] "Creating topology manager with none policy"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.620157 4778 container_manager_linux.go:303] "Creating device plugin manager"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.620848 4778 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.620924 4778 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.621932 4778 state_mem.go:36] "Initialized new in-memory state store"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.622789 4778 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.627339 4778 kubelet.go:418] "Attempting to sync node with API server"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.627392 4778 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.627498 4778 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.627532 4778 kubelet.go:324] "Adding apiserver pod source"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.627560 4778 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.632545 4778 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.634803 4778 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.637141 4778 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.638029 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused
Sep 30 17:17:41 crc kubenswrapper[4778]: E0930 17:17:41.638129 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError"
Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.638270 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused
Sep 30 17:17:41 crc kubenswrapper[4778]: E0930 17:17:41.638355 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.639110 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.639169 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.639185 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.639200 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.639223 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.639239 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.639255 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.639279 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.639298 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.639316 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.639337 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.639361 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.641411 4778 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.642361 4778 server.go:1280] "Started kubelet"
Sep 30 17:17:41 crc systemd[1]: Started Kubernetes Kubelet.
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.644830 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.645413 4778 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.645418 4778 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.645921 4778 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.645955 4778 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.645992 4778 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 07:08:23.251911335 +0000 UTC
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.646061 4778 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1453h50m41.605852537s for next certificate rotation
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.646193 4778 volume_manager.go:287] "The desired_state_of_world populator starts"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.646205 4778 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 30 17:17:41 crc kubenswrapper[4778]: E0930 17:17:41.646209 4778 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.646302 4778 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.654428 4778 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 30 17:17:41 crc kubenswrapper[4778]: E0930 17:17:41.657206 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="200ms"
Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.657802 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused
Sep 30 17:17:41 crc kubenswrapper[4778]: E0930 17:17:41.657902 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.657979 4778 server.go:460] "Adding debug handlers to kubelet server"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.658071 4778 factory.go:55] Registering systemd factory
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.658106 4778 factory.go:221] Registration of the systemd container factory successfully
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.658476 4778 factory.go:153] Registering CRI-O factory
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.658507 4778 factory.go:221] Registration of the crio container factory successfully
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.658593 4778 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.658643 4778 factory.go:103] Registering Raw factory
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.658837 4778 manager.go:1196] Started watching for new ooms in manager
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.665891 4778 manager.go:319] Starting recovery of all containers
Sep 30 17:17:41 crc kubenswrapper[4778]: E0930 17:17:41.664594 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.13:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a1ef98f59a788 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 17:17:41.642307464 +0000 UTC m=+0.632205297,LastTimestamp:2025-09-30 17:17:41.642307464 +0000 UTC m=+0.632205297,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669425 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669509 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669536 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669559 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669577 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669595 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669636 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669660 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669683 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669703 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669721 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669740 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669759 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669779 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669799 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669815 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669831 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669848 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669867 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669886 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669903 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669952 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.669975 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670000 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670021 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670039 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670063 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670083 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670105 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670124 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670145 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670197 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670216 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670236 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670259 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670282 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670302 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670322 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670342 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670363 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670383 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670401 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670420 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670437 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670457 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670474 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670495 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670514 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670533 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670550 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670568 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670587 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670668 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670695 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670714 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670737 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670759 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670779 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670799 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670819 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670838 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670856 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670878 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670897 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670918 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670941 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670962 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670979 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.670995 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671012 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671030 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671047 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671061 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671077 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671093 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671110 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671127 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671143 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671162 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671177 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671225 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671251 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671270 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671289 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671306 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671323 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671341 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671359 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671375 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671398 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671415 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671432 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671449 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671468 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671488 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671507 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671527 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671545 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671566 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671585 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671604 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671647 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671665 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671684 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671760 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671784 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671808 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671828 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671847 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671870 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671893 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671914 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671936 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671957 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671975 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.671993 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672010 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672127 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672150 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672301 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672331 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672356 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672374 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672392 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672410 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672430 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672459 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672477 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672496 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672524 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672549 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672566 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672582 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672599 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672640 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672659 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672676 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672694 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672711 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672760 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672780 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672798 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672818 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672835 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b"
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672852 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672869 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672886 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672903 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672940 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672962 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.672982 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673000 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673021 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673043 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673063 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673082 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673100 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673120 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673139 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673160 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673180 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673200 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673219 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673239 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673258 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673277 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673298 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673317 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673337 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673357 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673376 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673394 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673417 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673439 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673458 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673478 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673495 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673515 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673534 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673554 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673571 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673593 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673610 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673653 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673823 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673845 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673863 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673883 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673900 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673918 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673937 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673956 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673972 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.673991 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.674015 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.674033 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.674065 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.674083 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.674100 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.674117 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.674135 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.674154 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.674172 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.674190 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.674213 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.677748 4778 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.677808 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.677834 4778 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.677851 4778 reconstruct.go:97] "Volume reconstruction finished" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.677864 4778 reconciler.go:26] "Reconciler: start to sync state" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.682164 4778 manager.go:324] Recovery completed Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.693533 4778 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.695515 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.695597 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.695634 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.698537 4778 cpu_manager.go:225] "Starting CPU manager" policy="none" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.698567 4778 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.698589 4778 state_mem.go:36] "Initialized new in-memory state store" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.710866 4778 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.712595 4778 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.712648 4778 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.712678 4778 kubelet.go:2335] "Starting kubelet main sync loop" Sep 30 17:17:41 crc kubenswrapper[4778]: E0930 17:17:41.712727 4778 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 30 17:17:41 crc kubenswrapper[4778]: W0930 17:17:41.713220 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Sep 30 17:17:41 crc kubenswrapper[4778]: E0930 17:17:41.713269 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.722330 4778 policy_none.go:49] "None policy: Start" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.724121 4778 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.724160 4778 state_mem.go:35] "Initializing new in-memory state store" Sep 30 17:17:41 crc kubenswrapper[4778]: E0930 17:17:41.747220 4778 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.783739 4778 manager.go:334] "Starting Device Plugin manager" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.783932 4778 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.784030 4778 server.go:79] "Starting device plugin registration server" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.784714 4778 eviction_manager.go:189] "Eviction manager: starting control 
loop" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.784803 4778 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.785171 4778 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.785358 4778 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.785441 4778 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 30 17:17:41 crc kubenswrapper[4778]: E0930 17:17:41.795689 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.812860 4778 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.812975 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.814184 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.814347 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.814444 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.814725 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.814923 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.814965 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.815901 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.815954 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.815904 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.815972 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.815990 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.816006 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.816208 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.816427 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.816475 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.817735 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.817766 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.817780 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.817736 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.817852 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.817871 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.818086 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.818246 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.818350 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.819217 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.819244 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.819256 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.819422 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.819464 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.819494 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.819509 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.819561 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.819599 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.820253 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.820280 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.820294 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.820320 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.820342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.820352 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.820487 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.820522 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.821300 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.821338 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.821352 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:41 crc kubenswrapper[4778]: E0930 17:17:41.858882 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="400ms" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.879293 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.879364 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.879401 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.879430 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.879459 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.879540 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.879590 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.879645 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.879684 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.879713 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.879736 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.879760 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.879785 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.879812 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.879852 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.885147 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.886720 4778 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.886764 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.886776 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.886807 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:17:41 crc kubenswrapper[4778]: E0930 17:17:41.887354 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.13:6443: connect: connection refused" node="crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981325 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981373 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981403 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981422 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981462 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981490 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981507 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981524 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981544 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981565 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981582 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981599 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981638 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981673 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981690 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981885 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981903 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981918 
4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981978 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.981907 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.982079 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.982106 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.982128 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.982132 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.982149 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.982125 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.982184 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:41 crc 
kubenswrapper[4778]: I0930 17:17:41.982178 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.982164 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:17:41 crc kubenswrapper[4778]: I0930 17:17:41.982165 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.087891 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.090947 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.091007 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.091033 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.091076 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:17:42 crc kubenswrapper[4778]: E0930 17:17:42.091527 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.13:6443: connect: connection refused" node="crc" Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.165711 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.179449 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.185828 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.209098 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.215463 4778 util.go:30] "No sandbox for pod can be found. 
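
The reconciler_common.go:218 / operation_generator.go:637 pairs above are the kubelet's volume manager comparing its desired state of world against the actual state and mounting whatever is missing; every "operationExecutor.MountVolume started" line is later matched by a "MountVolume.SetUp succeeded" line for the same UniqueName. A minimal Go sketch of that reconcile pattern follows; the type and function names here are illustrative, not kubelet's actual API:

package main

import "fmt"

// volume stands in for kubelet's UniqueName (plugin/pod-UID-volume-name).
type volume struct {
	uniqueName, pod string
}

// reconcile walks the desired world and mounts whatever is not yet in the
// actual world, producing the started/succeeded pairs seen in the log.
func reconcile(desired []volume, mounted map[string]bool) {
	for _, v := range desired {
		if mounted[v.uniqueName] {
			continue // already in the actual state of world
		}
		fmt.Printf("operationExecutor.MountVolume started for volume %s pod=%s\n", v.uniqueName, v.pod)
		// Host-path SetUp only validates that the path exists, which is
		// why the "succeeded" lines above follow within microseconds.
		mounted[v.uniqueName] = true
		fmt.Printf("MountVolume.SetUp succeeded for volume %s pod=%s\n", v.uniqueName, v.pod)
	}
}

func main() {
	desired := []volume{
		{"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir", "openshift-etcd/etcd-crc"},
		{"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir", "openshift-kube-apiserver/kube-apiserver-crc"},
	}
	reconcile(desired, map[string]bool{})
}
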
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:17:42 crc kubenswrapper[4778]: W0930 17:17:42.232426 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-7d753898c66fac50a5ee59b54b1921b45f9cf1cce18ba03b1a92dde308a925e9 WatchSource:0}: Error finding container 7d753898c66fac50a5ee59b54b1921b45f9cf1cce18ba03b1a92dde308a925e9: Status 404 returned error can't find the container with id 7d753898c66fac50a5ee59b54b1921b45f9cf1cce18ba03b1a92dde308a925e9 Sep 30 17:17:42 crc kubenswrapper[4778]: W0930 17:17:42.233825 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-99dd5e7739afa59990fd4eed6de90babe0bc11ee4b6f0471c95c4cf3ceccc90d WatchSource:0}: Error finding container 99dd5e7739afa59990fd4eed6de90babe0bc11ee4b6f0471c95c4cf3ceccc90d: Status 404 returned error can't find the container with id 99dd5e7739afa59990fd4eed6de90babe0bc11ee4b6f0471c95c4cf3ceccc90d Sep 30 17:17:42 crc kubenswrapper[4778]: W0930 17:17:42.246226 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-28e83d543f31f6ba443c982dbd9644695073f22c0128af0902a18cb4627632d7 WatchSource:0}: Error finding container 28e83d543f31f6ba443c982dbd9644695073f22c0128af0902a18cb4627632d7: Status 404 returned error can't find the container with id 28e83d543f31f6ba443c982dbd9644695073f22c0128af0902a18cb4627632d7 Sep 30 17:17:42 crc kubenswrapper[4778]: E0930 17:17:42.261257 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="800ms" Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.492433 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.493606 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.493689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.493704 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.493734 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:17:42 crc kubenswrapper[4778]: E0930 17:17:42.494171 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.13:6443: connect: connection refused" node="crc" Sep 30 17:17:42 crc kubenswrapper[4778]: W0930 17:17:42.613948 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Sep 30 17:17:42 crc kubenswrapper[4778]: E0930 17:17:42.614027 4778 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.646525 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Sep 30 17:17:42 crc kubenswrapper[4778]: W0930 17:17:42.701379 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Sep 30 17:17:42 crc kubenswrapper[4778]: E0930 17:17:42.701516 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.717194 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"99dd5e7739afa59990fd4eed6de90babe0bc11ee4b6f0471c95c4cf3ceccc90d"} Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.718805 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7d753898c66fac50a5ee59b54b1921b45f9cf1cce18ba03b1a92dde308a925e9"} Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.720306 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"28e83d543f31f6ba443c982dbd9644695073f22c0128af0902a18cb4627632d7"} Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.722218 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"da6d50bb25691579052e6ab823e68725e271cd62ff6856e40c14c7299b7699ef"} Sep 30 17:17:42 crc kubenswrapper[4778]: I0930 17:17:42.723610 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ae2dc20058f3960a031d0d5ae9331d21ef57a710b830d6b5c60f56c5997c14d1"} Sep 30 17:17:42 crc kubenswrapper[4778]: W0930 17:17:42.774230 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Sep 30 17:17:42 crc kubenswrapper[4778]: E0930 17:17:42.774390 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:17:43 crc kubenswrapper[4778]: E0930 17:17:43.062706 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="1.6s" Sep 30 17:17:43 crc kubenswrapper[4778]: W0930 17:17:43.092449 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Sep 30 17:17:43 crc kubenswrapper[4778]: E0930 17:17:43.092550 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError" Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.295222 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.296546 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.296596 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.296610 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.296660 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:17:43 crc kubenswrapper[4778]: E0930 17:17:43.297133 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.13:6443: connect: connection refused" node="crc" Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.646376 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused Sep 30 17:17:43 crc kubenswrapper[4778]: E0930 17:17:43.707673 4778 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.13:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a1ef98f59a788 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 17:17:41.642307464 +0000 UTC m=+0.632205297,LastTimestamp:2025-09-30 17:17:41.642307464 +0000 UTC m=+0.632205297,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.730056 
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.730056 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf"}
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.730106 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d"}
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.730116 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719"}
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.730134 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362"}
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.730476 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.733441 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.733503 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.733519 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.734380 4778 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead" exitCode=0
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.734473 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.734499 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead"}
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.735436 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.735475 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.735486 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.736420 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a" exitCode=0
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.736495 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a"}
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.736515 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.737152 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.737177 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.737188 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.738430 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.738454 4778 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343" exitCode=0
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.738546 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.738557 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343"}
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.739162 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.739191 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.739203 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.739290 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.739313 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.739323 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.740821 4778 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="0e2727c5209e9ea11c33f67caf07f39964942e1fd413cc2c88082ecd55a3604e" exitCode=0
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.740866 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"0e2727c5209e9ea11c33f67caf07f39964942e1fd413cc2c88082ecd55a3604e"}
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.740918 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.741784 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.741822 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.741833 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:17:43 crc kubenswrapper[4778]: I0930 17:17:43.813671 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 17:17:44 crc kubenswrapper[4778]: W0930 17:17:44.407796 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused
Sep 30 17:17:44 crc kubenswrapper[4778]: E0930 17:17:44.407907 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.645910 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused
Sep 30 17:17:44 crc kubenswrapper[4778]: E0930 17:17:44.664991 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.13:6443: connect: connection refused" interval="3.2s"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.746789 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0346e20c576a248b7661996c6dcbc130d6b235e6bf30989e2df843d917fd687c"}
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.746843 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f76c10db4f564ed99cb9f9d92fb32b9276ac7185ab96c1f2cdc2c1123ea44b6e"}
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.746858 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"caf0e976c966dec145ecd3eb79013b835ce2a3a942935cafd7e38d8e87b182aa"}
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.746973 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.750724 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.750788 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.750804 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.757490 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.758042 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f28707a3b982b80660520a4c88d97fa1f92b30c6f5c6b090d6542d788f0d47fd"}
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.758087 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80"}
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.758102 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865"}
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.758114 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff"}
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.758124 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48"}
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.758447 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.758477 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.758487 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.759995 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.761073 4778 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123" exitCode=0
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.761131 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123"}
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.761192 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.761954 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.761982 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.761992 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.765101 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.765284 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.765430 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"6cce4b7a820073251d209e35cc2595c5d766a0d77d399ece693243088624c282"}
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.768579 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.768639 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.768650 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.768581 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.768813 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.768829 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:17:44 crc kubenswrapper[4778]: W0930 17:17:44.805648 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused
Sep 30 17:17:44 crc kubenswrapper[4778]: E0930 17:17:44.805748 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.897540 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.898653 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.898686 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.898697 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:17:44 crc kubenswrapper[4778]: I0930 17:17:44.898720 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 30 17:17:44 crc kubenswrapper[4778]: E0930 17:17:44.899144 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.13:6443: connect: connection refused" node="crc"
Sep 30 17:17:45 crc kubenswrapper[4778]: W0930 17:17:45.275132 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.13:6443: connect: connection refused
Sep 30 17:17:45 crc kubenswrapper[4778]: E0930 17:17:45.275293 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.13:6443: connect: connection refused" logger="UnhandledError"
Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.771692 4778 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d" exitCode=0
Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.771750 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d"}
Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.771840 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.771862 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.771921 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.771961 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.771973 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.772016 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.771975 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.773796 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.773821 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.773830 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.773863 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
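
The "Generic (PLEG): container finished ... exitCode=0" lines followed by ContainerDied events above are the static pods' init containers completing normally: each init container must run to completion before the next one starts, and only then do the long-running containers produce the ContainerStarted events that follow. A small Go sketch of that ordering; the container names are hypothetical, since the log shows only container IDs:

package main

import "fmt"

// initStep stands in for one init container of a static pod.
type initStep struct {
	name     string
	exitCode int
}

// startPod mirrors the PLEG sequence above: each init container must
// finish with exitCode=0 (one "ContainerDied" event per step) before the
// main containers produce their "ContainerStarted" events.
func startPod(pod string, inits []initStep, mains []string) error {
	for _, s := range inits {
		fmt.Printf("event for pod %q: ContainerDied (%s) exitCode=%d\n", pod, s.name, s.exitCode)
		if s.exitCode != 0 {
			return fmt.Errorf("init container %s failed", s.name)
		}
	}
	for _, c := range mains {
		fmt.Printf("event for pod %q: ContainerStarted (%s)\n", pod, c)
	}
	return nil
}

func main() {
	// etcd-crc above runs through several init steps, then starts five
	// containers in a burst; all names here are illustrative.
	_ = startPod("openshift-etcd/etcd-crc",
		[]initStep{{"setup", 0}, {"ensure-env-vars", 0}, {"resource-copy", 0}},
		[]string{"etcd-a", "etcd-b", "etcd-c", "etcd-d", "etcd-e"})
}
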
event="NodeHasNoDiskPressure" Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.773966 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.773964 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.774014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.774024 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.773877 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.774100 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.774109 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.775500 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.775527 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:45 crc kubenswrapper[4778]: I0930 17:17:45.775548 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:46 crc kubenswrapper[4778]: I0930 17:17:46.130918 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:17:46 crc kubenswrapper[4778]: I0930 17:17:46.138361 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:17:46 crc kubenswrapper[4778]: I0930 17:17:46.780808 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3"} Sep 30 17:17:46 crc kubenswrapper[4778]: I0930 17:17:46.780882 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be"} Sep 30 17:17:46 crc kubenswrapper[4778]: I0930 17:17:46.780899 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698"} Sep 30 17:17:46 crc kubenswrapper[4778]: I0930 17:17:46.780920 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7"} Sep 30 17:17:46 crc kubenswrapper[4778]: I0930 17:17:46.780939 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb"} Sep 30 17:17:46 crc kubenswrapper[4778]: I0930 17:17:46.780958 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:46 crc kubenswrapper[4778]: I0930 17:17:46.780999 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:46 crc kubenswrapper[4778]: I0930 17:17:46.782783 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:46 crc kubenswrapper[4778]: I0930 17:17:46.783342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:46 crc kubenswrapper[4778]: I0930 17:17:46.782798 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:46 crc kubenswrapper[4778]: I0930 17:17:46.783397 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:46 crc kubenswrapper[4778]: I0930 17:17:46.783417 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:46 crc kubenswrapper[4778]: I0930 17:17:46.783360 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.493776 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.494000 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.495529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.495608 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.495677 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.752292 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.752498 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.752539 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.754067 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.754138 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.754153 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.783643 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.783799 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.785324 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.785390 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.785405 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.785571 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.785770 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:47 crc kubenswrapper[4778]: I0930 17:17:47.785793 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:48 crc kubenswrapper[4778]: I0930 17:17:48.100247 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:48 crc kubenswrapper[4778]: I0930 17:17:48.101890 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:48 crc kubenswrapper[4778]: I0930 17:17:48.101971 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:48 crc kubenswrapper[4778]: I0930 17:17:48.101986 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:48 crc kubenswrapper[4778]: I0930 17:17:48.102023 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:17:48 crc kubenswrapper[4778]: I0930 17:17:48.449291 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Sep 30 17:17:48 crc kubenswrapper[4778]: I0930 17:17:48.620288 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:17:48 crc kubenswrapper[4778]: I0930 17:17:48.620596 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:48 crc kubenswrapper[4778]: I0930 17:17:48.622541 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:48 crc kubenswrapper[4778]: I0930 17:17:48.622600 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:48 crc kubenswrapper[4778]: I0930 17:17:48.622610 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:48 crc kubenswrapper[4778]: I0930 17:17:48.785813 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:48 crc kubenswrapper[4778]: I0930 17:17:48.786831 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:48 crc kubenswrapper[4778]: I0930 17:17:48.786881 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 
30 17:17:48 crc kubenswrapper[4778]: I0930 17:17:48.786895 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:49 crc kubenswrapper[4778]: I0930 17:17:49.512398 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:17:49 crc kubenswrapper[4778]: I0930 17:17:49.512713 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:49 crc kubenswrapper[4778]: I0930 17:17:49.514294 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:49 crc kubenswrapper[4778]: I0930 17:17:49.514358 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:49 crc kubenswrapper[4778]: I0930 17:17:49.514387 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:51 crc kubenswrapper[4778]: E0930 17:17:51.796103 4778 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 17:17:52 crc kubenswrapper[4778]: I0930 17:17:52.982343 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Sep 30 17:17:52 crc kubenswrapper[4778]: I0930 17:17:52.982576 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:52 crc kubenswrapper[4778]: I0930 17:17:52.983911 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:52 crc kubenswrapper[4778]: I0930 17:17:52.983975 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:52 crc kubenswrapper[4778]: I0930 17:17:52.983984 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:53 crc kubenswrapper[4778]: I0930 17:17:53.517689 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:17:53 crc kubenswrapper[4778]: I0930 17:17:53.518026 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:53 crc kubenswrapper[4778]: I0930 17:17:53.519758 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:53 crc kubenswrapper[4778]: I0930 17:17:53.519805 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:53 crc kubenswrapper[4778]: I0930 17:17:53.519821 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:53 crc kubenswrapper[4778]: I0930 17:17:53.522520 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:17:53 crc kubenswrapper[4778]: I0930 17:17:53.798390 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:53 crc kubenswrapper[4778]: I0930 17:17:53.799363 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:53 crc kubenswrapper[4778]: 
I0930 17:17:53.799395 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:53 crc kubenswrapper[4778]: I0930 17:17:53.799406 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:55 crc kubenswrapper[4778]: I0930 17:17:55.646908 4778 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Sep 30 17:17:55 crc kubenswrapper[4778]: W0930 17:17:55.665509 4778 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Sep 30 17:17:55 crc kubenswrapper[4778]: I0930 17:17:55.665679 4778 trace.go:236] Trace[1433057039]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 17:17:45.664) (total time: 10001ms): Sep 30 17:17:55 crc kubenswrapper[4778]: Trace[1433057039]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (17:17:55.665) Sep 30 17:17:55 crc kubenswrapper[4778]: Trace[1433057039]: [10.001190344s] [10.001190344s] END Sep 30 17:17:55 crc kubenswrapper[4778]: E0930 17:17:55.665721 4778 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Sep 30 17:17:55 crc kubenswrapper[4778]: I0930 17:17:55.806514 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 17:17:55 crc kubenswrapper[4778]: I0930 17:17:55.808895 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f28707a3b982b80660520a4c88d97fa1f92b30c6f5c6b090d6542d788f0d47fd" exitCode=255 Sep 30 17:17:55 crc kubenswrapper[4778]: I0930 17:17:55.808970 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f28707a3b982b80660520a4c88d97fa1f92b30c6f5c6b090d6542d788f0d47fd"} Sep 30 17:17:55 crc kubenswrapper[4778]: I0930 17:17:55.809165 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:55 crc kubenswrapper[4778]: I0930 17:17:55.810112 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:55 crc kubenswrapper[4778]: I0930 17:17:55.810167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:55 crc kubenswrapper[4778]: I0930 17:17:55.810187 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:55 crc kubenswrapper[4778]: I0930 17:17:55.811081 4778 scope.go:117] "RemoveContainer" containerID="f28707a3b982b80660520a4c88d97fa1f92b30c6f5c6b090d6542d788f0d47fd" Sep 30 17:17:55 crc kubenswrapper[4778]: I0930 
17:17:55.844001 4778 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 17:17:55 crc kubenswrapper[4778]: I0930 17:17:55.844079 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 17:17:55 crc kubenswrapper[4778]: I0930 17:17:55.848312 4778 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Sep 30 17:17:55 crc kubenswrapper[4778]: I0930 17:17:55.848398 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 17:17:56 crc kubenswrapper[4778]: I0930 17:17:56.517925 4778 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 17:17:56 crc kubenswrapper[4778]: I0930 17:17:56.518033 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 30 17:17:56 crc kubenswrapper[4778]: I0930 17:17:56.814635 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 17:17:56 crc kubenswrapper[4778]: I0930 17:17:56.816123 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f"} Sep 30 17:17:56 crc kubenswrapper[4778]: I0930 17:17:56.816359 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:56 crc kubenswrapper[4778]: I0930 17:17:56.817331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:56 crc kubenswrapper[4778]: I0930 17:17:56.817357 4778 kubelet_node_status.go:724] "Recording 
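
The Startup probe failures above show the kube-apiserver answering its own /livez probe with 403: the probe is made anonymously, and the RBAC clusterroles that allow system:anonymous to read /livez (system:public-info-viewer and OpenShift's system:openshift:public-info-viewer) have not been created yet this early in bringup. Kubelet HTTP probes count any status from 200 up to, but not including, 400 as success, so the 403 is a failure. A minimal sketch of that status-code rule, against a hypothetical endpoint:

package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeHTTP applies the standard kubelet rule for HTTP probes: any status
// in [200,400) is success; everything else, including the 403 that
// system:anonymous gets before RBAC is synced, is a failure.
func probeHTTP(url string) (string, error) {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return "failure", err
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return "success", nil
	}
	return "failure", fmt.Errorf("HTTP probe failed with statuscode: %d", resp.StatusCode)
}

func main() {
	// Hypothetical URL; on the node above the probe target is the
	// apiserver's /livez endpoint on port 6443.
	if result, err := probeHTTP("http://127.0.0.1:6443/livez"); err != nil {
		fmt.Printf("probeResult=%s err=%v\n", result, err)
	}
}
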
event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:56 crc kubenswrapper[4778]: I0930 17:17:56.817366 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:58 crc kubenswrapper[4778]: I0930 17:17:58.621075 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:17:58 crc kubenswrapper[4778]: I0930 17:17:58.621369 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:58 crc kubenswrapper[4778]: I0930 17:17:58.623066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:58 crc kubenswrapper[4778]: I0930 17:17:58.623282 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:58 crc kubenswrapper[4778]: I0930 17:17:58.623360 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:59 crc kubenswrapper[4778]: I0930 17:17:59.521712 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:17:59 crc kubenswrapper[4778]: I0930 17:17:59.521929 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:59 crc kubenswrapper[4778]: I0930 17:17:59.523515 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:59 crc kubenswrapper[4778]: I0930 17:17:59.523646 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:59 crc kubenswrapper[4778]: I0930 17:17:59.523660 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:17:59 crc kubenswrapper[4778]: I0930 17:17:59.529899 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:17:59 crc kubenswrapper[4778]: I0930 17:17:59.826368 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:17:59 crc kubenswrapper[4778]: I0930 17:17:59.828026 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:17:59 crc kubenswrapper[4778]: I0930 17:17:59.828079 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:17:59 crc kubenswrapper[4778]: I0930 17:17:59.828098 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:00 crc kubenswrapper[4778]: E0930 17:18:00.842163 4778 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Sep 30 17:18:00 crc kubenswrapper[4778]: I0930 17:18:00.843768 4778 trace.go:236] Trace[204591768]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 17:17:48.487) (total time: 12356ms): Sep 30 17:18:00 crc kubenswrapper[4778]: Trace[204591768]: ---"Objects listed" error: 12356ms (17:18:00.843) Sep 30 17:18:00 crc kubenswrapper[4778]: Trace[204591768]: [12.356192258s] 
[12.356192258s] END Sep 30 17:18:00 crc kubenswrapper[4778]: I0930 17:18:00.843805 4778 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Sep 30 17:18:00 crc kubenswrapper[4778]: I0930 17:18:00.844769 4778 trace.go:236] Trace[1041684935]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 17:17:49.608) (total time: 11236ms): Sep 30 17:18:00 crc kubenswrapper[4778]: Trace[1041684935]: ---"Objects listed" error: 11236ms (17:18:00.844) Sep 30 17:18:00 crc kubenswrapper[4778]: Trace[1041684935]: [11.236438375s] [11.236438375s] END Sep 30 17:18:00 crc kubenswrapper[4778]: I0930 17:18:00.844905 4778 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Sep 30 17:18:00 crc kubenswrapper[4778]: E0930 17:18:00.845777 4778 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Sep 30 17:18:00 crc kubenswrapper[4778]: I0930 17:18:00.846514 4778 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Sep 30 17:18:00 crc kubenswrapper[4778]: I0930 17:18:00.848171 4778 trace.go:236] Trace[746381247]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 17:17:49.291) (total time: 11557ms): Sep 30 17:18:00 crc kubenswrapper[4778]: Trace[746381247]: ---"Objects listed" error: 11556ms (17:18:00.847) Sep 30 17:18:00 crc kubenswrapper[4778]: Trace[746381247]: [11.55703416s] [11.55703416s] END Sep 30 17:18:00 crc kubenswrapper[4778]: I0930 17:18:00.848194 4778 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.606233 4778 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.638427 4778 apiserver.go:52] "Watching apiserver" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.654284 4778 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.654758 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.655352 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.655486 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.655396 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.655780 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.655905 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:18:01 crc kubenswrapper[4778]: E0930 17:18:01.655999 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.656067 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:01 crc kubenswrapper[4778]: E0930 17:18:01.656135 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:01 crc kubenswrapper[4778]: E0930 17:18:01.656212 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.658489 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.658503 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.659715 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.660758 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.661075 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.663834 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.664246 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.664868 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.668009 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.686909 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.708448 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.723472 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.747270 4778 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.751991 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752058 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752086 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752110 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752128 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752144 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752158 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752179 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752195 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752211 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752228 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752244 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752264 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752281 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752301 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752322 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752344 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752368 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752390 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752412 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752429 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752446 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752464 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752504 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752524 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752542 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752560 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752578 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752599 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752617 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752637 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752653 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752689 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752708 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752854 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752876 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 
17:18:01.752895 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752912 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752929 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752947 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752967 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752982 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.752998 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753012 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753029 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753028 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). 
InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753063 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753153 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753178 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753196 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753216 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753237 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753259 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753305 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753311 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753356 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753403 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753420 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753436 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753455 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753469 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753487 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753502 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753518 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753522 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753535 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753555 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753570 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753585 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753601 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753623 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753640 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753690 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:18:01 crc 
kubenswrapper[4778]: I0930 17:18:01.753707 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753728 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753744 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753760 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753774 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753788 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753803 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753822 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753839 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753855 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753873 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753888 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753910 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753938 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753956 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753968 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.753972 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754040 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754071 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754121 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754247 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754282 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754291 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754307 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754335 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754363 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754388 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754414 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754623 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754647 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754687 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754713 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754739 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754758 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754778 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754801 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754827 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754852 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754872 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754891 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754912 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754936 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754961 4778 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.754984 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755002 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755023 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755044 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755062 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755088 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755111 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755131 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755153 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755143 
4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755179 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755209 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755234 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755259 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755286 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755316 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755345 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755375 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755402 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755433 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755961 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756002 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.755988 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756012 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756295 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756362 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756425 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756540 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756554 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756697 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756730 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756795 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756819 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756841 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756869 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756895 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 
30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756923 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756949 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756975 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.756995 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757012 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757029 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757045 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757063 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757084 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757119 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757136 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757159 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757178 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757197 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757220 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757247 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757267 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757433 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757466 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757488 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757511 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757840 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757864 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757879 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757896 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757914 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757931 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757949 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757967 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.757984 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758000 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758017 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758034 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758054 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758070 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758087 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758106 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758124 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758143 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758160 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758178 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758197 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758215 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758231 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758248 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758266 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758283 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758303 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758320 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758337 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758363 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758382 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758400 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758419 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758437 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758456 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758475 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758496 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758513 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758533 4778 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758582 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758618 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758648 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758688 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758708 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758731 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758750 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758768 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758786 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758809 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758826 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758850 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758869 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758895 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758965 4778 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758985 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.758998 4778 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.759012 4778 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.759026 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.759038 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.759050 4778 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.759063 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.759076 4778 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.759089 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.759101 4778 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.759114 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.759129 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.759141 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.759152 4778 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.759162 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.759172 4778 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.760570 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.760872 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.761164 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.762116 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.762202 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.762539 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.762814 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.762812 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.772111 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.772458 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.772551 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.772585 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.772682 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.772959 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.773044 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.773189 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.773222 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.773216 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.773231 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.773416 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.773436 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.773498 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.773529 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.773600 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.773779 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.773847 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.773965 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.774253 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.774373 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.774426 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.774554 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.774589 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.774750 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.775784 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.776232 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.776333 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.777128 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.777229 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.777403 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.777432 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.777951 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.778734 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.779304 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.779540 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.779567 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.779673 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.779942 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.780081 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.780598 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.781589 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.782860 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.783739 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.783956 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.784194 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.784524 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.785132 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.785386 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.785665 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.786542 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.788680 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.788967 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.789399 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.789935 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.790017 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.790193 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.790371 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.790525 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.790710 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.790960 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.792166 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.792493 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.792511 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.792624 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.792964 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.793307 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.793602 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.793848 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.794147 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.794521 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.794638 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.794827 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.795865 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.796173 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.796393 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). 
InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.796621 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.796951 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.802872 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.803147 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.803148 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.803560 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.803749 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.803599 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.803937 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.803934 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.804064 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.804169 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.804248 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.804306 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.804372 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.804558 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.804575 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.804601 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.804689 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.804876 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.805440 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.805564 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.805884 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.806031 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: E0930 17:18:01.807025 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:18:02.306997641 +0000 UTC m=+21.296895444 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.811205 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.815344 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.815632 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.815634 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.815892 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.816269 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.816543 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.816984 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.817339 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.817807 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.817913 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.818057 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.818131 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.819219 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: E0930 17:18:01.819470 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.819575 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: E0930 17:18:01.819595 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:02.319565749 +0000 UTC m=+21.309463562 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.819855 4778 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.821295 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.821309 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.820332 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.821505 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.815841 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.821740 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.822035 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: E0930 17:18:01.819741 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:18:01 crc kubenswrapper[4778]: E0930 17:18:01.822293 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:02.322256155 +0000 UTC m=+21.312153958 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.819886 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.820248 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.820471 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.820822 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.820873 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.820971 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.821185 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.822573 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.822828 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.823254 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.823532 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.824084 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.826941 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.827009 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: E0930 17:18:01.828029 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:18:01 crc kubenswrapper[4778]: E0930 17:18:01.828064 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:18:01 crc kubenswrapper[4778]: E0930 17:18:01.828079 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:01 crc kubenswrapper[4778]: E0930 17:18:01.828177 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:02.328154801 +0000 UTC m=+21.318052604 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:01 crc kubenswrapper[4778]: E0930 17:18:01.828982 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:18:01 crc kubenswrapper[4778]: E0930 17:18:01.829062 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:18:01 crc kubenswrapper[4778]: E0930 17:18:01.829081 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:01 crc kubenswrapper[4778]: E0930 17:18:01.829214 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:02.329169374 +0000 UTC m=+21.319067177 (durationBeforeRetry 500ms). 
Sep 30 17:18:01 crc kubenswrapper[4778]: E0930 17:18:01.833834 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f.scope\": RecentStats: unable to find data in memory cache]"
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.836320 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.836375 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.836386 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.836709 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.837007 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.836954 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.836999 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.837450 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.837711 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.837736 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.837774 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.837849 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.837924 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.837970 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.838002 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.838060 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.838145 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.838294 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.838386 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.838527 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.838709 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.838763 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.838961 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.839007 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.839328 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.839496 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.839540 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.839500 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.839604 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.839862 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.839879 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.840052 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.840256 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.841779 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.849386 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.849517 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.849838 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.850248 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.850403 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.851086 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.853106 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.857380 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.860980 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861011 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861064 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861076 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861085 4778 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861094 4778 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861103 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861114 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861123 4778 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861131 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861140 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861149 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: 
\"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861144 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861160 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861221 4778 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861232 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861244 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861257 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861267 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861267 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861279 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861346 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861360 4778 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861376 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861388 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861399 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861409 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861420 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861430 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861439 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861450 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861460 4778 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861470 4778 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861480 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861489 4778 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861499 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861510 4778 reconciler_common.go:293] "Volume 
detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861520 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861532 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861543 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861553 4778 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861562 4778 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861571 4778 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861581 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861590 4778 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861602 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861646 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861660 4778 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861672 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861682 4778 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861692 4778 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861702 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861711 4778 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861721 4778 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861730 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861741 4778 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861749 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861759 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861768 4778 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861779 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861788 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861799 4778 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861808 
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861824 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861833 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861842 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861850 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861859 4778 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861868 4778 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861877 4778 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861887 4778 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861896 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861905 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861914 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861948 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861956 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861965 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861975 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861983 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.861992 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862000 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862008 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862016 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862025 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862034 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862042 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862050 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862059 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862067 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: 
\"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862077 4778 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862085 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862094 4778 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862104 4778 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862113 4778 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862121 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862131 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862140 4778 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862149 4778 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862158 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862166 4778 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862176 4778 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862185 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: 
\"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862193 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862201 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862210 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862218 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862226 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862235 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862243 4778 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862251 4778 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862265 4778 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862275 4778 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862283 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862292 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862300 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862309 4778 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862317 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862326 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862334 4778 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862342 4778 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862351 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862361 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862370 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862379 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862405 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862414 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862422 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862431 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862440 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862451 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862460 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862469 4778 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862477 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862486 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862494 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862503 4778 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862514 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862522 4778 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862530 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862540 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862549 4778 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862558 4778 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862567 4778 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862575 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862583 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862591 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862600 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862608 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862619 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.862920 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863395 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863404 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863414 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863423 4778 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863434 4778 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863644 4778 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863653 4778 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863678 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863689 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863698 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863707 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863718 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863726 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863734 4778 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863743 4778 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863752 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863760 4778 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc 
kubenswrapper[4778]: I0930 17:18:01.863769 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863777 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863787 4778 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863795 4778 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863804 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863812 4778 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863822 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863832 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863842 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863851 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.863860 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.866731 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.874818 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.878470 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.880136 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.880408 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.889907 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.901361 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.916370 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.926590 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.965040 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.965076 4778 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.965086 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.969964 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.978582 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 17:18:01 crc kubenswrapper[4778]: I0930 17:18:01.984634 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 17:18:01 crc kubenswrapper[4778]: W0930 17:18:01.992167 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-4ed4fcadbedb717b53f0367dc9c6b4a8a84454d2d39594e518c4d1fa9a717bd7 WatchSource:0}: Error finding container 4ed4fcadbedb717b53f0367dc9c6b4a8a84454d2d39594e518c4d1fa9a717bd7: Status 404 returned error can't find the container with id 4ed4fcadbedb717b53f0367dc9c6b4a8a84454d2d39594e518c4d1fa9a717bd7 Sep 30 17:18:01 crc kubenswrapper[4778]: W0930 17:18:01.998433 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-6885dc646298b5adebb3f6d73635dcf34bab67be9562a6ed97f475da9443cb31 WatchSource:0}: Error finding container 6885dc646298b5adebb3f6d73635dcf34bab67be9562a6ed97f475da9443cb31: Status 404 returned error can't find the container with id 6885dc646298b5adebb3f6d73635dcf34bab67be9562a6ed97f475da9443cb31 Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.364540 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-v2b4f"] Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.364992 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v2b4f" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.368696 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.369416 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.370417 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.370499 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.370527 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.370559 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.370602 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:02 crc kubenswrapper[4778]: E0930 17:18:02.370691 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:18:03.370664242 +0000 UTC m=+22.360562045 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:18:02 crc kubenswrapper[4778]: E0930 17:18:02.370688 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:18:02 crc kubenswrapper[4778]: E0930 17:18:02.370744 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:03.370736625 +0000 UTC m=+22.360634428 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:18:02 crc kubenswrapper[4778]: E0930 17:18:02.370748 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:18:02 crc kubenswrapper[4778]: E0930 17:18:02.370688 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:18:02 crc kubenswrapper[4778]: E0930 17:18:02.370758 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:18:02 crc kubenswrapper[4778]: E0930 17:18:02.370785 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:03.370778305 +0000 UTC m=+22.360676098 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:18:02 crc kubenswrapper[4778]: E0930 17:18:02.370784 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:18:02 crc kubenswrapper[4778]: E0930 17:18:02.370802 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:02 crc kubenswrapper[4778]: E0930 17:18:02.370825 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:03.370819837 +0000 UTC m=+22.360717640 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:02 crc kubenswrapper[4778]: E0930 17:18:02.370762 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:18:02 crc kubenswrapper[4778]: E0930 17:18:02.370838 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:02 crc kubenswrapper[4778]: E0930 17:18:02.370856 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:03.370851318 +0000 UTC m=+22.360749121 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.372348 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.386870 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.401189 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.418188 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.434050 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.447658 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.458079 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.469702 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.470995 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpdtj\" (UniqueName: \"kubernetes.io/projected/5a2b9e51-adbe-4bba-9e7c-facada66c035-kube-api-access-gpdtj\") pod \"node-resolver-v2b4f\" (UID: \"5a2b9e51-adbe-4bba-9e7c-facada66c035\") " pod="openshift-dns/node-resolver-v2b4f" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.471084 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5a2b9e51-adbe-4bba-9e7c-facada66c035-hosts-file\") pod \"node-resolver-v2b4f\" (UID: \"5a2b9e51-adbe-4bba-9e7c-facada66c035\") " pod="openshift-dns/node-resolver-v2b4f" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.571966 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5a2b9e51-adbe-4bba-9e7c-facada66c035-hosts-file\") pod \"node-resolver-v2b4f\" (UID: \"5a2b9e51-adbe-4bba-9e7c-facada66c035\") " pod="openshift-dns/node-resolver-v2b4f" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.572020 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpdtj\" (UniqueName: \"kubernetes.io/projected/5a2b9e51-adbe-4bba-9e7c-facada66c035-kube-api-access-gpdtj\") pod \"node-resolver-v2b4f\" (UID: \"5a2b9e51-adbe-4bba-9e7c-facada66c035\") " pod="openshift-dns/node-resolver-v2b4f" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.572119 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5a2b9e51-adbe-4bba-9e7c-facada66c035-hosts-file\") pod \"node-resolver-v2b4f\" (UID: \"5a2b9e51-adbe-4bba-9e7c-facada66c035\") " pod="openshift-dns/node-resolver-v2b4f" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.599530 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpdtj\" (UniqueName: \"kubernetes.io/projected/5a2b9e51-adbe-4bba-9e7c-facada66c035-kube-api-access-gpdtj\") pod \"node-resolver-v2b4f\" (UID: \"5a2b9e51-adbe-4bba-9e7c-facada66c035\") " pod="openshift-dns/node-resolver-v2b4f" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.676549 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v2b4f" Sep 30 17:18:02 crc kubenswrapper[4778]: W0930 17:18:02.697066 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a2b9e51_adbe_4bba_9e7c_facada66c035.slice/crio-0f82497dcf39c7bc1c5cd334623cdfba92732713ae82c251c2e9c1406b22b6c4 WatchSource:0}: Error finding container 0f82497dcf39c7bc1c5cd334623cdfba92732713ae82c251c2e9c1406b22b6c4: Status 404 returned error can't find the container with id 0f82497dcf39c7bc1c5cd334623cdfba92732713ae82c251c2e9c1406b22b6c4 Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.713042 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:02 crc kubenswrapper[4778]: E0930 17:18:02.713194 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.737688 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-vmbxd"] Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.738115 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.739069 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-f5fmb"] Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.739381 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-cwrn6"] Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.739530 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.739991 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.740080 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.741441 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.741798 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.743248 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.743284 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.743551 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.743624 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.743633 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.743837 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.745042 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.745090 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.746362 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.766214 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.772750 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-multus-conf-dir\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.772792 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1737d305-b819-48f8-b703-6b5549129dd2-os-release\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.772813 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-var-lib-cni-bin\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.772831 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmqcg\" (UniqueName: \"kubernetes.io/projected/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-kube-api-access-jmqcg\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.772850 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-os-release\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.772866 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ac448347-b650-429e-9e31-f8f9b7565f6e-rootfs\") pod \"machine-config-daemon-f5fmb\" (UID: \"ac448347-b650-429e-9e31-f8f9b7565f6e\") " pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.772884 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-run-netns\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.772902 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ac448347-b650-429e-9e31-f8f9b7565f6e-mcd-auth-proxy-config\") pod \"machine-config-daemon-f5fmb\" (UID: \"ac448347-b650-429e-9e31-f8f9b7565f6e\") " pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.772919 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1737d305-b819-48f8-b703-6b5549129dd2-cnibin\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.772945 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-cni-binary-copy\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.772972 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-multus-socket-dir-parent\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.772987 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-etc-kubernetes\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.773091 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjsgb\" (UniqueName: \"kubernetes.io/projected/1737d305-b819-48f8-b703-6b5549129dd2-kube-api-access-jjsgb\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.773323 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-multus-cni-dir\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 
17:18:02.773339 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-var-lib-kubelet\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.773355 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-var-lib-cni-multus\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.773382 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-run-multus-certs\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.773397 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-multus-daemon-config\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.773415 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1737d305-b819-48f8-b703-6b5549129dd2-cni-binary-copy\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.773438 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-hostroot\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.773452 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1737d305-b819-48f8-b703-6b5549129dd2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.773465 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1737d305-b819-48f8-b703-6b5549129dd2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.773507 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-cnibin\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " 
pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.773523 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1737d305-b819-48f8-b703-6b5549129dd2-system-cni-dir\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.773539 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-system-cni-dir\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.773553 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-run-k8s-cni-cncf-io\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.773568 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ac448347-b650-429e-9e31-f8f9b7565f6e-proxy-tls\") pod \"machine-config-daemon-f5fmb\" (UID: \"ac448347-b650-429e-9e31-f8f9b7565f6e\") " pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.773588 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv5jd\" (UniqueName: \"kubernetes.io/projected/ac448347-b650-429e-9e31-f8f9b7565f6e-kube-api-access-wv5jd\") pod \"machine-config-daemon-f5fmb\" (UID: \"ac448347-b650-429e-9e31-f8f9b7565f6e\") " pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.782118 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.796151 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.807583 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.821618 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.836523 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836"} Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.836588 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630"} Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.836600 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e10d6f5637533caa246f2e22b2b9e6efdb08bb2e54ec07a12e9226ef014fbc39"} Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.838373 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v2b4f" event={"ID":"5a2b9e51-adbe-4bba-9e7c-facada66c035","Type":"ContainerStarted","Data":"0f82497dcf39c7bc1c5cd334623cdfba92732713ae82c251c2e9c1406b22b6c4"} Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.839448 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.841067 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5"} Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.841117 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"4ed4fcadbedb717b53f0367dc9c6b4a8a84454d2d39594e518c4d1fa9a717bd7"} Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.844132 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.845302 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.847064 4778 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f" exitCode=255 Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.847153 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f"} Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.847226 4778 scope.go:117] "RemoveContainer" containerID="f28707a3b982b80660520a4c88d97fa1f92b30c6f5c6b090d6542d788f0d47fd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.848798 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6885dc646298b5adebb3f6d73635dcf34bab67be9562a6ed97f475da9443cb31"} Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.857591 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.868504 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.874581 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-multus-cni-dir\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.874721 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-multus-cni-dir\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.874628 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-var-lib-kubelet\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.874784 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-var-lib-kubelet\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.874796 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-var-lib-cni-multus\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.874859 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-run-multus-certs\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.874885 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-hostroot\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.874908 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-multus-daemon-config\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.874932 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1737d305-b819-48f8-b703-6b5549129dd2-cni-binary-copy\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " 
pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.874972 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-hostroot\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.874967 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-var-lib-cni-multus\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.874989 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-run-multus-certs\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875231 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1737d305-b819-48f8-b703-6b5549129dd2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875257 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1737d305-b819-48f8-b703-6b5549129dd2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875279 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-cnibin\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875302 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1737d305-b819-48f8-b703-6b5549129dd2-system-cni-dir\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875323 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ac448347-b650-429e-9e31-f8f9b7565f6e-proxy-tls\") pod \"machine-config-daemon-f5fmb\" (UID: \"ac448347-b650-429e-9e31-f8f9b7565f6e\") " pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875347 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-system-cni-dir\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 
17:18:02.875370 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-run-k8s-cni-cncf-io\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875408 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv5jd\" (UniqueName: \"kubernetes.io/projected/ac448347-b650-429e-9e31-f8f9b7565f6e-kube-api-access-wv5jd\") pod \"machine-config-daemon-f5fmb\" (UID: \"ac448347-b650-429e-9e31-f8f9b7565f6e\") " pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875436 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1737d305-b819-48f8-b703-6b5549129dd2-system-cni-dir\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875443 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-cnibin\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875487 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-multus-conf-dir\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875523 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-run-k8s-cni-cncf-io\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875539 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-system-cni-dir\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875458 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-multus-conf-dir\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875634 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1737d305-b819-48f8-b703-6b5549129dd2-os-release\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875694 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-var-lib-cni-bin\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875717 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmqcg\" (UniqueName: \"kubernetes.io/projected/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-kube-api-access-jmqcg\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875742 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-os-release\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875766 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ac448347-b650-429e-9e31-f8f9b7565f6e-rootfs\") pod \"machine-config-daemon-f5fmb\" (UID: \"ac448347-b650-429e-9e31-f8f9b7565f6e\") " pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875791 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-run-netns\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875820 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-os-release\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875827 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1737d305-b819-48f8-b703-6b5549129dd2-os-release\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875839 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ac448347-b650-429e-9e31-f8f9b7565f6e-mcd-auth-proxy-config\") pod \"machine-config-daemon-f5fmb\" (UID: \"ac448347-b650-429e-9e31-f8f9b7565f6e\") " pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875855 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-run-netns\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875871 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1737d305-b819-48f8-b703-6b5549129dd2-cnibin\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " 
pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875889 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ac448347-b650-429e-9e31-f8f9b7565f6e-rootfs\") pod \"machine-config-daemon-f5fmb\" (UID: \"ac448347-b650-429e-9e31-f8f9b7565f6e\") " pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875921 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-cni-binary-copy\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875947 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-multus-socket-dir-parent\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875968 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-etc-kubernetes\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.875991 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjsgb\" (UniqueName: \"kubernetes.io/projected/1737d305-b819-48f8-b703-6b5549129dd2-kube-api-access-jjsgb\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.876069 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-host-var-lib-cni-bin\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.876128 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1737d305-b819-48f8-b703-6b5549129dd2-cnibin\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.876131 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-multus-socket-dir-parent\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.876161 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-etc-kubernetes\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.876230 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1737d305-b819-48f8-b703-6b5549129dd2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.876492 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ac448347-b650-429e-9e31-f8f9b7565f6e-mcd-auth-proxy-config\") pod \"machine-config-daemon-f5fmb\" (UID: \"ac448347-b650-429e-9e31-f8f9b7565f6e\") " pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.876695 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-multus-daemon-config\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.876986 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-cni-binary-copy\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.878930 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ac448347-b650-429e-9e31-f8f9b7565f6e-proxy-tls\") pod \"machine-config-daemon-f5fmb\" (UID: \"ac448347-b650-429e-9e31-f8f9b7565f6e\") " pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.880513 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1737d305-b819-48f8-b703-6b5549129dd2-cni-binary-copy\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.880642 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1737d305-b819-48f8-b703-6b5549129dd2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.883205 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.895968 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjsgb\" (UniqueName: \"kubernetes.io/projected/1737d305-b819-48f8-b703-6b5549129dd2-kube-api-access-jjsgb\") pod \"multus-additional-cni-plugins-cwrn6\" (UID: \"1737d305-b819-48f8-b703-6b5549129dd2\") " pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.897273 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv5jd\" (UniqueName: \"kubernetes.io/projected/ac448347-b650-429e-9e31-f8f9b7565f6e-kube-api-access-wv5jd\") pod \"machine-config-daemon-f5fmb\" (UID: \"ac448347-b650-429e-9e31-f8f9b7565f6e\") " pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.897838 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmqcg\" (UniqueName: \"kubernetes.io/projected/99e0ced4-d228-4bfa-a263-b8934f0d8e5d-kube-api-access-jmqcg\") pod \"multus-vmbxd\" (UID: \"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\") " pod="openshift-multus/multus-vmbxd" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.901716 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.916062 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.929705 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.938075 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.938793 4778 scope.go:117] "RemoveContainer" containerID="a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f" Sep 30 17:18:02 crc kubenswrapper[4778]: E0930 17:18:02.939211 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.947290 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.961115 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.974861 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.988389 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:02 crc kubenswrapper[4778]: I0930 17:18:02.999487 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.013546 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.014043 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.028015 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.028360 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.044495 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.065013 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vmbxd" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.072249 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.081301 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.092515 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.103170 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podI
P\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: W0930 17:18:03.116458 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1737d305_b819_48f8_b703_6b5549129dd2.slice/crio-05f63fd9715aa3091279e9ea5012d9eb60fbf968e42372d43f9aa6bb8f97a485 WatchSource:0}: Error finding container 05f63fd9715aa3091279e9ea5012d9eb60fbf968e42372d43f9aa6bb8f97a485: Status 404 returned error can't find the container with id 05f63fd9715aa3091279e9ea5012d9eb60fbf968e42372d43f9aa6bb8f97a485 Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.148019 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.148054 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.151050 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.157899 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kzlfx"] Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.158719 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.160717 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.163435 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.164101 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.167688 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.167866 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.167997 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.173670 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.174986 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178489 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-run-ovn-kubernetes\") pod 
\"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178522 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-node-log\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178540 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-etc-openvswitch\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178557 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-openvswitch\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178574 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-env-overrides\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178590 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx6xj\" (UniqueName: \"kubernetes.io/projected/deb38969-9012-468f-87aa-2e70a5f8f3c4-kube-api-access-mx6xj\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178620 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-run-netns\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178638 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-cni-netd\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178707 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovnkube-config\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178724 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovnkube-script-lib\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178753 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-slash\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178770 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178819 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-systemd\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178847 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-log-socket\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178863 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-cni-bin\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178881 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-kubelet\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178897 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-systemd-units\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178914 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-var-lib-openvswitch\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178931 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-ovn\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.178947 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovn-node-metrics-cert\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.210580 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.226709 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.244094 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28707a3b982b80660520a4c88d97fa1f92b30c6f5c6b090d6542d788f0d47fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:17:55Z\\\",\\\"message\\\":\\\"W0930 17:17:44.835446 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
17:17:44.836116 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759252664 cert, and key in /tmp/serving-cert-832660841/serving-signer.crt, /tmp/serving-cert-832660841/serving-signer.key\\\\nI0930 17:17:45.123915 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:17:45.127405 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:17:45.127632 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:17:45.129291 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-832660841/tls.crt::/tmp/serving-cert-832660841/tls.key\\\\\\\"\\\\nF0930 17:17:55.567213 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 
17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.258435 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.272030 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279587 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-run-netns\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279659 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-cni-netd\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279686 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovnkube-config\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279711 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-slash\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279728 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279745 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovnkube-script-lib\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279763 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-systemd\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279802 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-cni-netd\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279815 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-slash\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279847 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-systemd\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279825 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279908 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-log-socket\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279931 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-cni-bin\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279932 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-log-socket\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279934 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-run-netns\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279957 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-cni-bin\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279953 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-kubelet\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279992 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-systemd-units\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280012 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-var-lib-openvswitch\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280058 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-var-lib-openvswitch\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280073 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-ovn\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.279992 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-kubelet\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280094 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovn-node-metrics-cert\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280111 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-ovn\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280049 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-systemd-units\") pod \"ovnkube-node-kzlfx\" (UID: 
\"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280148 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-run-ovn-kubernetes\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280121 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-run-ovn-kubernetes\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280195 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-node-log\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280215 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-etc-openvswitch\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280232 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-openvswitch\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280255 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-env-overrides\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280272 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx6xj\" (UniqueName: \"kubernetes.io/projected/deb38969-9012-468f-87aa-2e70a5f8f3c4-kube-api-access-mx6xj\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280512 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovnkube-script-lib\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280568 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-node-log\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280586 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-openvswitch\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280670 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovnkube-config\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280701 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-etc-openvswitch\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.280939 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-env-overrides\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.285364 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovn-node-metrics-cert\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.293556 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.296621 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx6xj\" (UniqueName: \"kubernetes.io/projected/deb38969-9012-468f-87aa-2e70a5f8f3c4-kube-api-access-mx6xj\") pod \"ovnkube-node-kzlfx\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 
17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.307122 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.319164 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.335092 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.357408 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.373439 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.381714 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.381847 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.381876 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.381900 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:03 crc kubenswrapper[4778]: E0930 17:18:03.381942 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:18:05.381896694 +0000 UTC m=+24.371794487 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:18:03 crc kubenswrapper[4778]: E0930 17:18:03.382026 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:18:03 crc kubenswrapper[4778]: E0930 17:18:03.382053 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:18:03 crc kubenswrapper[4778]: E0930 17:18:03.382069 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:18:03 crc kubenswrapper[4778]: E0930 17:18:03.381974 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:18:03 crc kubenswrapper[4778]: E0930 17:18:03.382141 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:18:03 crc kubenswrapper[4778]: E0930 17:18:03.382157 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:18:03 crc kubenswrapper[4778]: E0930 17:18:03.382168 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.382051 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:03 crc kubenswrapper[4778]: E0930 17:18:03.382077 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:05.38206785 +0000 UTC m=+24.371965653 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:18:03 crc kubenswrapper[4778]: E0930 17:18:03.382234 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:05.382225595 +0000 UTC m=+24.372123398 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:18:03 crc kubenswrapper[4778]: E0930 17:18:03.382247 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:05.382240845 +0000 UTC m=+24.372138648 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:03 crc kubenswrapper[4778]: E0930 17:18:03.382080 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:03 crc kubenswrapper[4778]: E0930 17:18:03.382299 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2025-09-30 17:18:05.382292937 +0000 UTC m=+24.372190740 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.388349 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.401871 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.415435 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.439371 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0e
a333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.459828 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28707a3b982b80660520a4c88d97fa1f92b30c6f5c6b090d6542d788f0d47fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:17:55Z\\\",\\\"message\\\":\\\"W0930 17:17:44.835446 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
17:17:44.836116 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759252664 cert, and key in /tmp/serving-cert-832660841/serving-signer.crt, /tmp/serving-cert-832660841/serving-signer.key\\\\nI0930 17:17:45.123915 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:17:45.127405 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:17:45.127632 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:17:45.129291 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-832660841/tls.crt::/tmp/serving-cert-832660841/tls.key\\\\\\\"\\\\nF0930 17:17:55.567213 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 
17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.476314 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.477463 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: W0930 17:18:03.488617 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeb38969_9012_468f_87aa_2e70a5f8f3c4.slice/crio-a5691e2d24f75ec03534d5562335ee8be013408ab586b08b46003a7c95bc2b07 WatchSource:0}: Error finding container a5691e2d24f75ec03534d5562335ee8be013408ab586b08b46003a7c95bc2b07: Status 404 returned error can't find the container with id a5691e2d24f75ec03534d5562335ee8be013408ab586b08b46003a7c95bc2b07 Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.489085 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.522235 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.529769 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.535512 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.552944 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.561713 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.577778 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0e
a333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.592310 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28707a3b982b80660520a4c88d97fa1f92b30c6f5c6b090d6542d788f0d47fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:17:55Z\\\",\\\"message\\\":\\\"W0930 17:17:44.835446 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
17:17:44.836116 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759252664 cert, and key in /tmp/serving-cert-832660841/serving-signer.crt, /tmp/serving-cert-832660841/serving-signer.key\\\\nI0930 17:17:45.123915 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:17:45.127405 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:17:45.127632 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:17:45.129291 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-832660841/tls.crt::/tmp/serving-cert-832660841/tls.key\\\\\\\"\\\\nF0930 17:17:55.567213 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 
17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.605363 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.616972 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.630213 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.642395 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.657373 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.671576 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.690953 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.707208 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.713286 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.713306 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:03 crc kubenswrapper[4778]: E0930 17:18:03.713486 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:03 crc kubenswrapper[4778]: E0930 17:18:03.713590 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.718026 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.719111 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.720060 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.720921 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.721697 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.722337 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.723310 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.724139 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.725025 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.728049 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Sep 30 
17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.728905 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.729035 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.730434 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.731147 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.731711 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.734225 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.734956 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.739093 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.739715 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.740417 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.741694 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.742351 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.744139 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.744766 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.745606 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.746934 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.747758 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.748820 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.749136 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.749777 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.751239 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.751880 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.752523 4778 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 
17:18:03.753222 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.761084 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.762685 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.763367 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.765511 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.767132 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.767413 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.767964 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.769523 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.770455 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.771715 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.773248 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.774903 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.775844 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.777250 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.777961 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.779299 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.780322 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.781778 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.782386 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.783031 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.784263 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.785071 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.786281 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.786408 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.808846 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.824306 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.843726 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\
"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.854184 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v2b4f" event={"ID":"5a2b9e51-adbe-4bba-9e7c-facada66c035","Type":"ContainerStarted","Data":"67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2"} Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.855484 4778 generic.go:334] "Generic (PLEG): container finished" podID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerID="f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b" exitCode=0 Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.855549 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerDied","Data":"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b"} Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.855581 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerStarted","Data":"a5691e2d24f75ec03534d5562335ee8be013408ab586b08b46003a7c95bc2b07"} Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.857245 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.861167 4778 scope.go:117] "RemoveContainer" containerID="a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f" Sep 30 17:18:03 crc kubenswrapper[4778]: E0930 17:18:03.861338 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.862641 4778 generic.go:334] "Generic (PLEG): container finished" podID="1737d305-b819-48f8-b703-6b5549129dd2" containerID="9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02" exitCode=0 Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.862740 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" event={"ID":"1737d305-b819-48f8-b703-6b5549129dd2","Type":"ContainerDied","Data":"9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02"} Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.862774 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" event={"ID":"1737d305-b819-48f8-b703-6b5549129dd2","Type":"ContainerStarted","Data":"05f63fd9715aa3091279e9ea5012d9eb60fbf968e42372d43f9aa6bb8f97a485"} Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.864004 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vmbxd" event={"ID":"99e0ced4-d228-4bfa-a263-b8934f0d8e5d","Type":"ContainerStarted","Data":"7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233"} Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.864035 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vmbxd" event={"ID":"99e0ced4-d228-4bfa-a263-b8934f0d8e5d","Type":"ContainerStarted","Data":"246ce9ca7fbd3f44f6e6525814cc3d0bdb2a646f2f74617b51d44537e0b382b2"} Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.873289 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerStarted","Data":"d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d"} Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.873338 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerStarted","Data":"abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505"} Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.873349 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerStarted","Data":"614ce1cc224d705656badd2f432d5eeec03a41a205f52d104ee06a49595792cd"} Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.884958 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001
e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: E0930 17:18:03.903889 4778 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.961879 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc
/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28707a3b982b80660520a4c88d97fa1f92b30c6f5c6b090d6542d788f0d47fd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:17:55Z\\\",\\\"message\\\":\\\"W0930 17:17:44.835446 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 17:17:44.836116 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759252664 cert, and key in /tmp/serving-cert-832660841/serving-signer.crt, /tmp/serving-cert-832660841/serving-signer.key\\\\nI0930 17:17:45.123915 1 observer_polling.go:159] Starting file observer\\\\nW0930 17:17:45.127405 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 17:17:45.127632 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:17:45.129291 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-832660841/tls.crt::/tmp/serving-cert-832660841/tls.key\\\\\\\"\\\\nF0930 17:17:55.567213 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 
17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:03 crc kubenswrapper[4778]: I0930 17:18:03.995279 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:03Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.018928 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.051963 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.089904 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.133785 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.169167 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.211311 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.249263 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.294552 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.331531 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.379492 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc 
kubenswrapper[4778]: I0930 17:18:04.416270 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"Po
dInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.449862 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.492024 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.533718 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.577621 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"et
cd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.610841 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.650197 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.689757 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.713496 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:04 crc kubenswrapper[4778]: E0930 17:18:04.713794 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.727952 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.880430 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerStarted","Data":"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145"} Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.880481 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerStarted","Data":"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3"} Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.880499 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerStarted","Data":"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab"} Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.880510 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerStarted","Data":"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2"} Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.881806 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e"} Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.884243 4778 generic.go:334] "Generic (PLEG): container finished" podID="1737d305-b819-48f8-b703-6b5549129dd2" containerID="29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566" exitCode=0 Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.884377 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" event={"ID":"1737d305-b819-48f8-b703-6b5549129dd2","Type":"ContainerDied","Data":"29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566"} Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.884872 4778 scope.go:117] "RemoveContainer" containerID="a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f" Sep 30 17:18:04 crc kubenswrapper[4778]: E0930 17:18:04.885076 4778 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.897570 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.910686 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.940323 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.957845 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.972716 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:04 crc kubenswrapper[4778]: I0930 17:18:04.986574 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:04Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.010021 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.051920 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.092299 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.130527 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.170708 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc 
kubenswrapper[4778]: I0930 17:18:05.216191 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"Po
dInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.247933 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.299148 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.342521 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.369089 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.409156 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.409293 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:05 crc kubenswrapper[4778]: E0930 17:18:05.409316 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:18:09.409276565 +0000 UTC m=+28.399174368 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.409383 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:05 crc kubenswrapper[4778]: E0930 17:18:05.409436 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:18:05 crc kubenswrapper[4778]: E0930 17:18:05.409560 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.409476 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:05 crc kubenswrapper[4778]: E0930 17:18:05.409579 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:09.409555064 +0000 UTC m=+28.399452867 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:18:05 crc kubenswrapper[4778]: E0930 17:18:05.409586 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.409645 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:05 crc kubenswrapper[4778]: E0930 17:18:05.409659 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:05 crc kubenswrapper[4778]: E0930 17:18:05.409569 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:18:05 crc kubenswrapper[4778]: E0930 17:18:05.409717 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:09.409695199 +0000 UTC m=+28.399593162 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:05 crc kubenswrapper[4778]: E0930 17:18:05.409721 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:18:05 crc kubenswrapper[4778]: E0930 17:18:05.409738 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:09.409727789 +0000 UTC m=+28.399625782 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:18:05 crc kubenswrapper[4778]: E0930 17:18:05.409740 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:18:05 crc kubenswrapper[4778]: E0930 17:18:05.409765 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:05 crc kubenswrapper[4778]: E0930 17:18:05.409799 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:09.409790812 +0000 UTC m=+28.399688615 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.412479 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-
30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.457996 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z 
is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.488583 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.531072 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.579396 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.611876 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.650460 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.690234 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.713262 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.713309 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:05 crc kubenswrapper[4778]: E0930 17:18:05.713417 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:05 crc kubenswrapper[4778]: E0930 17:18:05.713585 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.729177 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.769472 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.810458 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e
35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.849619 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.889761 4778 generic.go:334] "Generic (PLEG): container finished" podID="1737d305-b819-48f8-b703-6b5549129dd2" containerID="9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477" exitCode=0 Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.889861 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" event={"ID":"1737d305-b819-48f8-b703-6b5549129dd2","Type":"ContainerDied","Data":"9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477"} Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.893831 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerStarted","Data":"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33"} Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.893893 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerStarted","Data":"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd"} Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.903509 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.932432 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:05 crc kubenswrapper[4778]: I0930 17:18:05.977538 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:05Z 
is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.010628 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.050564 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.093416 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.131560 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.168384 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.210223 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.254843 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.290168 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.336256 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.369501 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.409903 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.712902 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:06 crc kubenswrapper[4778]: E0930 17:18:06.713398 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.899564 4778 generic.go:334] "Generic (PLEG): container finished" podID="1737d305-b819-48f8-b703-6b5549129dd2" containerID="ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76" exitCode=0 Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.899662 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" event={"ID":"1737d305-b819-48f8-b703-6b5549129dd2","Type":"ContainerDied","Data":"ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76"} Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.911919 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.924885 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.945210 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.960970 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.975218 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.987256 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:06 crc kubenswrapper[4778]: I0930 17:18:06.997945 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:06Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.008390 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.023029 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.037509 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.051574 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.064855 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.081231 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.101664 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z 
is after 2025-08-24T17:21:41Z" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.246108 4778 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.248101 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.248136 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.248149 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.248249 4778 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.254950 4778 kubelet_node_status.go:115] "Node was previously registered" node="crc" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.255222 4778 kubelet_node_status.go:79] "Successfully registered node" node="crc" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.256289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.256321 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.256331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.256348 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.256359 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:07Z","lastTransitionTime":"2025-09-30T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:07 crc kubenswrapper[4778]: E0930 17:18:07.268787 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.272275 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.272300 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.272309 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.272325 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.272336 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:07Z","lastTransitionTime":"2025-09-30T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:07 crc kubenswrapper[4778]: E0930 17:18:07.283663 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.288298 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.288346 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.288357 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.288414 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.288427 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:07Z","lastTransitionTime":"2025-09-30T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:07 crc kubenswrapper[4778]: E0930 17:18:07.303242 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.307279 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.307330 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.307342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.307364 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.307376 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:07Z","lastTransitionTime":"2025-09-30T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:18:07 crc kubenswrapper[4778]: E0930 17:18:07.320473 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload elided; identical to previous attempt] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z is after 2025-08-24T17:21:41Z"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.324614 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.324689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.324714 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.324739 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.324753 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:07Z","lastTransitionTime":"2025-09-30T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:18:07 crc kubenswrapper[4778]: E0930 17:18:07.336872 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload elided; identical to previous attempt] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z is after 2025-08-24T17:21:41Z"
Sep 30 17:18:07 crc kubenswrapper[4778]: E0930 17:18:07.336997 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.339108 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.339163 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.339186 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.339214 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.339225 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:07Z","lastTransitionTime":"2025-09-30T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.442117 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.442163 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.442172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.442187 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.442196 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:07Z","lastTransitionTime":"2025-09-30T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.544135 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.544177 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.544186 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.544207 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.544216 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:07Z","lastTransitionTime":"2025-09-30T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.646472 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.646511 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.646520 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.646535 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.646545 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:07Z","lastTransitionTime":"2025-09-30T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.713300 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.713353 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:18:07 crc kubenswrapper[4778]: E0930 17:18:07.713466 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 17:18:07 crc kubenswrapper[4778]: E0930 17:18:07.713552 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.749321 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.749377 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.749392 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.749412 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.749427 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:07Z","lastTransitionTime":"2025-09-30T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.852382 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.852434 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.852444 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.852459 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.852470 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:07Z","lastTransitionTime":"2025-09-30T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.905471 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" event={"ID":"1737d305-b819-48f8-b703-6b5549129dd2","Type":"ContainerStarted","Data":"742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023"} Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.909686 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerStarted","Data":"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c"} Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.922779 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/opens
hift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.937732 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.952845 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.954898 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.954936 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.954946 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.954964 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.954976 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:07Z","lastTransitionTime":"2025-09-30T17:18:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.973282 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37
f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.986901 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:07 crc kubenswrapper[4778]: I0930 17:18:07.998227 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:07Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.008363 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.020874 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.037443 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.052854 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.057383 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.057439 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.057450 4778 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.057465 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.057478 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:08Z","lastTransitionTime":"2025-09-30T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.066271 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.080741 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.103967 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.119671 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.161241 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.161298 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.161316 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.161344 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.161364 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:08Z","lastTransitionTime":"2025-09-30T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.264749 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.264820 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.264838 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.264863 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.264881 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:08Z","lastTransitionTime":"2025-09-30T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.367563 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.367641 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.367654 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.367672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.367684 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:08Z","lastTransitionTime":"2025-09-30T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.456190 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-kcwn2"] Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.456573 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kcwn2" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.458602 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.461066 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.461586 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.463190 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.470346 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.470392 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.470401 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.470419 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.470435 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:08Z","lastTransitionTime":"2025-09-30T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.479466 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.499471 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.516966 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.542167 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z 
is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.542426 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fa3bcb63-ebc5-4490-af95-b1325e664f48-host\") pod \"node-ca-kcwn2\" (UID: \"fa3bcb63-ebc5-4490-af95-b1325e664f48\") " pod="openshift-image-registry/node-ca-kcwn2" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.542620 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fa3bcb63-ebc5-4490-af95-b1325e664f48-serviceca\") pod \"node-ca-kcwn2\" (UID: \"fa3bcb63-ebc5-4490-af95-b1325e664f48\") " pod="openshift-image-registry/node-ca-kcwn2" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.542735 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9vjp\" (UniqueName: \"kubernetes.io/projected/fa3bcb63-ebc5-4490-af95-b1325e664f48-kube-api-access-x9vjp\") pod \"node-ca-kcwn2\" (UID: \"fa3bcb63-ebc5-4490-af95-b1325e664f48\") " pod="openshift-image-registry/node-ca-kcwn2" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.554479 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.568126 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.573369 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.573423 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.573435 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.573456 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.573468 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:08Z","lastTransitionTime":"2025-09-30T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.581777 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.601788 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0e
a333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.617749 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.632921 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.643487 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9vjp\" (UniqueName: \"kubernetes.io/projected/fa3bcb63-ebc5-4490-af95-b1325e664f48-kube-api-access-x9vjp\") pod \"node-ca-kcwn2\" (UID: \"fa3bcb63-ebc5-4490-af95-b1325e664f48\") " pod="openshift-image-registry/node-ca-kcwn2" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.643560 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fa3bcb63-ebc5-4490-af95-b1325e664f48-host\") pod \"node-ca-kcwn2\" (UID: \"fa3bcb63-ebc5-4490-af95-b1325e664f48\") " pod="openshift-image-registry/node-ca-kcwn2" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.643654 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fa3bcb63-ebc5-4490-af95-b1325e664f48-serviceca\") pod \"node-ca-kcwn2\" (UID: \"fa3bcb63-ebc5-4490-af95-b1325e664f48\") " pod="openshift-image-registry/node-ca-kcwn2" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.643702 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fa3bcb63-ebc5-4490-af95-b1325e664f48-host\") pod \"node-ca-kcwn2\" (UID: \"fa3bcb63-ebc5-4490-af95-b1325e664f48\") " pod="openshift-image-registry/node-ca-kcwn2" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.645085 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fa3bcb63-ebc5-4490-af95-b1325e664f48-serviceca\") pod \"node-ca-kcwn2\" (UID: \"fa3bcb63-ebc5-4490-af95-b1325e664f48\") " pod="openshift-image-registry/node-ca-kcwn2" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.647967 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.659023 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.661374 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9vjp\" (UniqueName: \"kubernetes.io/projected/fa3bcb63-ebc5-4490-af95-b1325e664f48-kube-api-access-x9vjp\") pod \"node-ca-kcwn2\" (UID: \"fa3bcb63-ebc5-4490-af95-b1325e664f48\") " pod="openshift-image-registry/node-ca-kcwn2" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.673113 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.676110 4778 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.676150 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.676163 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.676185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.676196 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:08Z","lastTransitionTime":"2025-09-30T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.685293 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.698591 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.713427 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:08 crc kubenswrapper[4778]: E0930 17:18:08.713691 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.777248 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kcwn2" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.778869 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.778954 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.778971 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.778989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.779025 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:08Z","lastTransitionTime":"2025-09-30T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:08 crc kubenswrapper[4778]: W0930 17:18:08.794973 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa3bcb63_ebc5_4490_af95_b1325e664f48.slice/crio-7dfd6aca7c7f6045baf13c6ff41b22bae62ede1f7c883dd7a275372727eb97cd WatchSource:0}: Error finding container 7dfd6aca7c7f6045baf13c6ff41b22bae62ede1f7c883dd7a275372727eb97cd: Status 404 returned error can't find the container with id 7dfd6aca7c7f6045baf13c6ff41b22bae62ede1f7c883dd7a275372727eb97cd Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.883404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.883446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.883456 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.883472 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.883482 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:08Z","lastTransitionTime":"2025-09-30T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.918183 4778 generic.go:334] "Generic (PLEG): container finished" podID="1737d305-b819-48f8-b703-6b5549129dd2" containerID="742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023" exitCode=0 Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.918215 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" event={"ID":"1737d305-b819-48f8-b703-6b5549129dd2","Type":"ContainerDied","Data":"742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023"} Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.919496 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kcwn2" event={"ID":"fa3bcb63-ebc5-4490-af95-b1325e664f48","Type":"ContainerStarted","Data":"7dfd6aca7c7f6045baf13c6ff41b22bae62ede1f7c883dd7a275372727eb97cd"} Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.937418 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.960648 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":
\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.974685 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.987127 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.987160 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:08 crc 
kubenswrapper[4778]: I0930 17:18:08.987170 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.987189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.987198 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:08Z","lastTransitionTime":"2025-09-30T17:18:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:08 crc kubenswrapper[4778]: I0930 17:18:08.988537 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:08Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.016767 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"fini
shedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.032747 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.048942 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.064155 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.085363 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.090603 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.090671 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.090683 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.090721 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.090732 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:09Z","lastTransitionTime":"2025-09-30T17:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.101478 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.115861 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.132616 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.149761 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.165031 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.183905 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:09Z 
is after 2025-08-24T17:21:41Z" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.194120 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.194160 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.194197 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.194216 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.194226 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:09Z","lastTransitionTime":"2025-09-30T17:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.296978 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.297031 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.297040 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.297059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.297076 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:09Z","lastTransitionTime":"2025-09-30T17:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.400683 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.400748 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.400766 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.400794 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.400813 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:09Z","lastTransitionTime":"2025-09-30T17:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.452468 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.452591 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.452671 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.452697 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.452727 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:09 crc kubenswrapper[4778]: E0930 17:18:09.453068 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:18:09 crc kubenswrapper[4778]: E0930 17:18:09.453077 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:18:09 crc kubenswrapper[4778]: E0930 17:18:09.453125 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:18:09 crc kubenswrapper[4778]: E0930 17:18:09.453135 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:17.45311511 +0000 UTC m=+36.443012913 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:18:09 crc kubenswrapper[4778]: E0930 17:18:09.453143 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:09 crc kubenswrapper[4778]: E0930 17:18:09.453176 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:18:09 crc kubenswrapper[4778]: E0930 17:18:09.453268 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:18:09 crc kubenswrapper[4778]: E0930 17:18:09.453338 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:18:09 crc kubenswrapper[4778]: E0930 17:18:09.453222 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:17.453197132 +0000 UTC m=+36.443095115 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:09 crc kubenswrapper[4778]: E0930 17:18:09.453358 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:09 crc kubenswrapper[4778]: E0930 17:18:09.453467 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:18:17.453403529 +0000 UTC m=+36.443301332 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:18:09 crc kubenswrapper[4778]: E0930 17:18:09.453501 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:17.453485292 +0000 UTC m=+36.443383095 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:18:09 crc kubenswrapper[4778]: E0930 17:18:09.453530 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:17.453522123 +0000 UTC m=+36.443420156 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.503594 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.503673 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.503683 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.503700 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.503714 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:09Z","lastTransitionTime":"2025-09-30T17:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.606598 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.606689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.606701 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.606721 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.606734 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:09Z","lastTransitionTime":"2025-09-30T17:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.709660 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.709698 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.709713 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.709737 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.709749 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:09Z","lastTransitionTime":"2025-09-30T17:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.713294 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.713363 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:09 crc kubenswrapper[4778]: E0930 17:18:09.713476 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:09 crc kubenswrapper[4778]: E0930 17:18:09.713649 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.812062 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.812604 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.812622 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.812659 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.812675 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:09Z","lastTransitionTime":"2025-09-30T17:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.915757 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.915800 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.915811 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.915831 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.915844 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:09Z","lastTransitionTime":"2025-09-30T17:18:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.924042 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kcwn2" event={"ID":"fa3bcb63-ebc5-4490-af95-b1325e664f48","Type":"ContainerStarted","Data":"2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282"} Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.929192 4778 generic.go:334] "Generic (PLEG): container finished" podID="1737d305-b819-48f8-b703-6b5549129dd2" containerID="9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b" exitCode=0 Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.929245 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" event={"ID":"1737d305-b819-48f8-b703-6b5549129dd2","Type":"ContainerDied","Data":"9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b"} Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.937964 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T17:18:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.957292 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":
\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:09 crc kubenswrapper[4778]: I0930 17:18:09.973934 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.006866 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0e
a333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.019453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.019517 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.019535 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.019945 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.019984 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:10Z","lastTransitionTime":"2025-09-30T17:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.030059 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.046275 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.066969 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.084680 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.100710 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.119956 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.123339 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.123424 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.123444 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.123533 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.123594 4778 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:10Z","lastTransitionTime":"2025-09-30T17:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.139019 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.156501 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.171391 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.189401 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.209962 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z 
is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.227666 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.227708 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.227718 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.227918 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.227927 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:10Z","lastTransitionTime":"2025-09-30T17:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.227958 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.244847 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.266333 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.280330 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.291912 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.302697 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.313620 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.325163 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.330014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.330047 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.330059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.330076 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.330088 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:10Z","lastTransitionTime":"2025-09-30T17:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.338206 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.350985 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.364457 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.380458 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics 
northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"
host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.390442 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.398583 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.408921 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.432607 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.432680 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.432689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.432706 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.432717 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:10Z","lastTransitionTime":"2025-09-30T17:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.535124 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.535172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.535182 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.535198 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.535209 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:10Z","lastTransitionTime":"2025-09-30T17:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.637950 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.637985 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.637994 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.638008 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.638017 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:10Z","lastTransitionTime":"2025-09-30T17:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.712881 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:10 crc kubenswrapper[4778]: E0930 17:18:10.713025 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.740098 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.740132 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.740143 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.740157 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.740167 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:10Z","lastTransitionTime":"2025-09-30T17:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.842045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.842097 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.842107 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.842123 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.842158 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:10Z","lastTransitionTime":"2025-09-30T17:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.935580 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerStarted","Data":"0e0fd80ac17327a42091762c5c1a502cfdb7e3d0d343c5f066e6b18e1e823374"} Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.935878 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.935916 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.940501 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" event={"ID":"1737d305-b819-48f8-b703-6b5549129dd2","Type":"ContainerStarted","Data":"526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86"} Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.944104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.944131 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.944141 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.944153 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.944162 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:10Z","lastTransitionTime":"2025-09-30T17:18:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.950326 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.961173 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:10 crc 
kubenswrapper[4778]: I0930 17:18:10.962129 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.971211 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.987086 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:10 crc kubenswrapper[4778]: I0930 17:18:10.997843 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:10Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.007539 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.017021 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.027776 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.047106 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.047154 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.047163 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.047179 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.047189 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:11Z","lastTransitionTime":"2025-09-30T17:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.047135 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0fd80ac17327a42091762c5c1a502cfdb7e3d0d343c5f066e6b18e1e823374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.060397 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.071853 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.086461 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.097540 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.109909 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.120064 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.143090 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-sock
et\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0fd80ac17327a42091762c5c1a502cfdb7e3d0d343c5f066e6b18e1e823374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.149748 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.149809 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.149821 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.149843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.149857 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:11Z","lastTransitionTime":"2025-09-30T17:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.158929 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.171356 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.185202 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.197158 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\
\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.206613 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.216137 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.229235 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.241655 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.253689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.253750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.253763 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.253783 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.253802 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:11Z","lastTransitionTime":"2025-09-30T17:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.257948 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.278220 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.293443 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.305705 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.318802 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.345820 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.358342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.358384 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.358395 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.358411 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.358421 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:11Z","lastTransitionTime":"2025-09-30T17:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.461330 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.461370 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.461382 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.461397 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.461408 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:11Z","lastTransitionTime":"2025-09-30T17:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.564656 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.564700 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.564710 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.564726 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.564736 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:11Z","lastTransitionTime":"2025-09-30T17:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.667909 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.667951 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.667980 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.667994 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.668004 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:11Z","lastTransitionTime":"2025-09-30T17:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.713583 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.713599 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:11 crc kubenswrapper[4778]: E0930 17:18:11.713778 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:11 crc kubenswrapper[4778]: E0930 17:18:11.713926 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.730688 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd
791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.746140 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.758449 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.772667 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.774544 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.774643 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.774655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.774670 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.774681 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:11Z","lastTransitionTime":"2025-09-30T17:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.796419 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.815941 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":
\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0fd80ac17327a42091762c5c1a502cfdb7e3d0d343c5f066e6b18e1e823374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.827792 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.839912 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.855381 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.869162 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.877817 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.877880 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.877894 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.877916 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.877930 4778 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:11Z","lastTransitionTime":"2025-09-30T17:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.885786 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":
\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.908193 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.926705 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.944153 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.944826 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.961211 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.974951 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.981316 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.981364 4778 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.981375 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.981396 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.981407 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:11Z","lastTransitionTime":"2025-09-30T17:18:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:11 crc kubenswrapper[4778]: I0930 17:18:11.989596 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:11Z is after 2025-08-24T17:21:41Z" 
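Every "Failed to update status for pod" entry above shares one root cause: the kubelet's status PATCH is routed through the pod.network-node-identity.openshift.io validating webhook at https://127.0.0.1:9743/pod, and TLS verification fails because the webhook's serving certificate expired on 2025-08-24T17:21:41Z, well before the current time the kubelet reports (2025-09-30T17:18:11Z). A minimal diagnostic sketch follows — not part of the captured log; it assumes Python 3 with the third-party cryptography package installed and that the endpoint is reachable from the node — showing one way to read that certificate's validity window directly from the listener:

    import socket
    import ssl

    from cryptography import x509

    # Endpoint taken from the log entries above; everything else is illustrative.
    HOST, PORT = "127.0.0.1", 9743

    # Disable verification deliberately: the goal is to inspect the expired
    # certificate, which strict verification would reject exactly as the kubelet does.
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            # binary_form=True returns the raw DER bytes even when unverified.
            der = tls.getpeercert(binary_form=True)

    cert = x509.load_der_x509_certificate(der)
    print("notBefore:", cert.not_valid_before)
    print("notAfter: ", cert.not_valid_after)

A notAfter in the past means every admission review sent to this endpoint fails TLS verification, which is why each status patch in the surrounding entries is rejected with the same x509 error rather than anything pod-specific.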
Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.004206 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\
\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:12Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.017897 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:12Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.034749 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:12Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.070986 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:12Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.084939 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.084987 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.084997 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.085017 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.085036 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:12Z","lastTransitionTime":"2025-09-30T17:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.117564 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:12Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.148027 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:12Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.188085 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.188137 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.188152 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.188173 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.188188 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:12Z","lastTransitionTime":"2025-09-30T17:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.191954 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:12Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.242980 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:12Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.272425 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:12Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.290938 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.290996 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.291010 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.291036 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.291051 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:12Z","lastTransitionTime":"2025-09-30T17:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.310858 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:12Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.352789 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:12Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.394043 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.394092 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.394103 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.394121 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.394138 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:12Z","lastTransitionTime":"2025-09-30T17:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.398389 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:12Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.440955 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0fd80ac17327a42091762c5c1a502cfdb7e3d0d343c5f066e6b18e1e823374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:12Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.470207 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:12Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.496351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.496383 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.496391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.496406 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.496415 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:12Z","lastTransitionTime":"2025-09-30T17:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.598778 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.598820 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.598834 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.598852 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.598863 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:12Z","lastTransitionTime":"2025-09-30T17:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.702671 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.702747 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.702761 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.702782 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.702794 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:12Z","lastTransitionTime":"2025-09-30T17:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.712976 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:12 crc kubenswrapper[4778]: E0930 17:18:12.713190 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.806291 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.806352 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.806362 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.806392 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.806421 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:12Z","lastTransitionTime":"2025-09-30T17:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.909328 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.909370 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.909381 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.909397 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:12 crc kubenswrapper[4778]: I0930 17:18:12.909410 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:12Z","lastTransitionTime":"2025-09-30T17:18:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.012134 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.012191 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.012205 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.012227 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.012239 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:13Z","lastTransitionTime":"2025-09-30T17:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.114933 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.114986 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.114999 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.115017 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.115047 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:13Z","lastTransitionTime":"2025-09-30T17:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.218270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.218323 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.218338 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.218363 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.218379 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:13Z","lastTransitionTime":"2025-09-30T17:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.321534 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.321674 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.321696 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.321729 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.321749 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:13Z","lastTransitionTime":"2025-09-30T17:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.425117 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.425163 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.425176 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.425195 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.425206 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:13Z","lastTransitionTime":"2025-09-30T17:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.528300 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.528365 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.528382 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.528403 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.528419 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:13Z","lastTransitionTime":"2025-09-30T17:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.632420 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.632484 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.632502 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.632528 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.632548 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:13Z","lastTransitionTime":"2025-09-30T17:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.714098 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.714213 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:13 crc kubenswrapper[4778]: E0930 17:18:13.714288 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:13 crc kubenswrapper[4778]: E0930 17:18:13.714526 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.735211 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.735254 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.735268 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.735286 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.735297 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:13Z","lastTransitionTime":"2025-09-30T17:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.838904 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.838949 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.838958 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.838973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.838982 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:13Z","lastTransitionTime":"2025-09-30T17:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.942475 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.942555 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.942573 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.942600 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.942668 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:13Z","lastTransitionTime":"2025-09-30T17:18:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.991163 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovnkube-controller/0.log" Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.995714 4778 generic.go:334] "Generic (PLEG): container finished" podID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerID="0e0fd80ac17327a42091762c5c1a502cfdb7e3d0d343c5f066e6b18e1e823374" exitCode=1 Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.995786 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerDied","Data":"0e0fd80ac17327a42091762c5c1a502cfdb7e3d0d343c5f066e6b18e1e823374"} Sep 30 17:18:13 crc kubenswrapper[4778]: I0930 17:18:13.997249 4778 scope.go:117] "RemoveContainer" containerID="0e0fd80ac17327a42091762c5c1a502cfdb7e3d0d343c5f066e6b18e1e823374" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.022966 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0e
a333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.043596 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.045493 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.045536 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.045580 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.045603 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.045659 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:14Z","lastTransitionTime":"2025-09-30T17:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.061933 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.076059 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.094434 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.109035 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.125896 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e
35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.142850 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.148569 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.148662 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.148678 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.148706 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.148719 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:14Z","lastTransitionTime":"2025-09-30T17:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.163259 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.176072 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.195432 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.214949 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0fd80ac17327a42091762c5c1a502cfdb7e3d0d343c5f066e6b18e1e823374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0fd80ac17327a42091762c5c1a502cfdb7e3d0d343c5f066e6b18e1e823374\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:13Z\\\",\\\"message\\\":\\\"\\\\nI0930 17:18:13.159899 6081 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:18:13.159871 6081 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:18:13.159988 6081 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 17:18:13.160043 6081 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:18:13.160090 6081 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:18:13.160169 6081 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:18:13.160293 6081 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:18:13.160739 6081 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:18:13.161130 6081 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.227578 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.
168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.241578 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disab
led\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.251562 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.251636 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.251651 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.251677 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.251695 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:14Z","lastTransitionTime":"2025-09-30T17:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.256218 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:14Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.354375 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.354431 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.354443 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.354465 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.354481 4778 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:14Z","lastTransitionTime":"2025-09-30T17:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.457026 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.457073 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.457083 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.457100 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.457110 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:14Z","lastTransitionTime":"2025-09-30T17:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.560526 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.561021 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.561033 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.561052 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.561066 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:14Z","lastTransitionTime":"2025-09-30T17:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.664375 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.664469 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.664486 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.664513 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.664532 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:14Z","lastTransitionTime":"2025-09-30T17:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.713386 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:14 crc kubenswrapper[4778]: E0930 17:18:14.713561 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.767061 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.767136 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.767155 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.767186 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.767205 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:14Z","lastTransitionTime":"2025-09-30T17:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.869872 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.869949 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.869974 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.870007 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.870031 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:14Z","lastTransitionTime":"2025-09-30T17:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.973161 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.973271 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.973291 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.973318 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:14 crc kubenswrapper[4778]: I0930 17:18:14.973337 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:14Z","lastTransitionTime":"2025-09-30T17:18:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.004236 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovnkube-controller/0.log" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.009663 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerStarted","Data":"549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3"} Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.076819 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.076874 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.076888 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.076912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.076931 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:15Z","lastTransitionTime":"2025-09-30T17:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.180334 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.180405 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.180425 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.180505 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.180524 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:15Z","lastTransitionTime":"2025-09-30T17:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.284104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.284159 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.284172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.284191 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.284211 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:15Z","lastTransitionTime":"2025-09-30T17:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.387333 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.387381 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.387398 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.387417 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.387432 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:15Z","lastTransitionTime":"2025-09-30T17:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.490180 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.490240 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.490253 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.490273 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.490286 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:15Z","lastTransitionTime":"2025-09-30T17:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.520672 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf"] Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.521160 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.523510 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.523886 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.537683 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.558141 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.587156 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.593162 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.593205 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.593218 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.593239 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.593254 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:15Z","lastTransitionTime":"2025-09-30T17:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.614171 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.622767 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/faf8b25c-5e3f-4eee-8c3f-b384e2dafa92-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hs6lf\" (UID: \"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.622824 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/faf8b25c-5e3f-4eee-8c3f-b384e2dafa92-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hs6lf\" (UID: \"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.622874 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvbvh\" (UniqueName: \"kubernetes.io/projected/faf8b25c-5e3f-4eee-8c3f-b384e2dafa92-kube-api-access-fvbvh\") pod \"ovnkube-control-plane-749d76644c-hs6lf\" (UID: \"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.622899 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/faf8b25c-5e3f-4eee-8c3f-b384e2dafa92-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hs6lf\" (UID: \"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.626310 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.646707 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.660691 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.675101 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.691522 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.695436 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.695490 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.695507 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.695526 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.695539 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:15Z","lastTransitionTime":"2025-09-30T17:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.708465 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.713139 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.713179 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:15 crc kubenswrapper[4778]: E0930 17:18:15.713331 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:15 crc kubenswrapper[4778]: E0930 17:18:15.713498 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.723776 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvbvh\" (UniqueName: \"kubernetes.io/projected/faf8b25c-5e3f-4eee-8c3f-b384e2dafa92-kube-api-access-fvbvh\") pod \"ovnkube-control-plane-749d76644c-hs6lf\" (UID: \"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.723819 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/faf8b25c-5e3f-4eee-8c3f-b384e2dafa92-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hs6lf\" (UID: \"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.723877 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/faf8b25c-5e3f-4eee-8c3f-b384e2dafa92-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hs6lf\" (UID: \"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.723917 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/faf8b25c-5e3f-4eee-8c3f-b384e2dafa92-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hs6lf\" (UID: \"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.724311 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.725032 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/faf8b25c-5e3f-4eee-8c3f-b384e2dafa92-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hs6lf\" (UID: \"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.725076 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/faf8b25c-5e3f-4eee-8c3f-b384e2dafa92-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hs6lf\" (UID: \"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.731365 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/faf8b25c-5e3f-4eee-8c3f-b384e2dafa92-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hs6lf\" (UID: \"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.740144 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.743533 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvbvh\" (UniqueName: \"kubernetes.io/projected/faf8b25c-5e3f-4eee-8c3f-b384e2dafa92-kube-api-access-fvbvh\") pod \"ovnkube-control-plane-749d76644c-hs6lf\" (UID: \"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.755773 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.771143 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.788671 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.798842 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.798895 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.798905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.798923 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.798935 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:15Z","lastTransitionTime":"2025-09-30T17:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.807196 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0fd80ac17327a42091762c5c1a502cfdb7e3d0d343c5f066e6b18e1e823374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0fd80ac17327a42091762c5c1a502cfdb7e3d0d343c5f066e6b18e1e823374\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:13Z\\\",\\\"message\\\":\\\"\\\\nI0930 17:18:13.159899 6081 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:18:13.159871 6081 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:18:13.159988 6081 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 17:18:13.160043 6081 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:18:13.160090 6081 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:18:13.160169 6081 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:18:13.160293 6081 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:18:13.160739 6081 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:18:13.161130 6081 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:15Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.835942 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" Sep 30 17:18:15 crc kubenswrapper[4778]: W0930 17:18:15.849588 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaf8b25c_5e3f_4eee_8c3f_b384e2dafa92.slice/crio-ad4dcfa1e89d1e216731946ab5f1a60c7a9deb7ade5461f93d483566cc434204 WatchSource:0}: Error finding container ad4dcfa1e89d1e216731946ab5f1a60c7a9deb7ade5461f93d483566cc434204: Status 404 returned error can't find the container with id ad4dcfa1e89d1e216731946ab5f1a60c7a9deb7ade5461f93d483566cc434204 Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.902713 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.902753 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.902761 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.902778 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:15 crc kubenswrapper[4778]: I0930 17:18:15.902788 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:15Z","lastTransitionTime":"2025-09-30T17:18:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.005252 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.005296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.005306 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.005328 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.005343 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:16Z","lastTransitionTime":"2025-09-30T17:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.013946 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" event={"ID":"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92","Type":"ContainerStarted","Data":"ad4dcfa1e89d1e216731946ab5f1a60c7a9deb7ade5461f93d483566cc434204"} Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.014327 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.029204 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.050919 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.065540 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.081673 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.099676 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.107848 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.107900 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.107912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.107932 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.107946 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:16Z","lastTransitionTime":"2025-09-30T17:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.112711 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.124151 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.135107 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.148741 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.166484 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.185942 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.210008 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://549497f6dfa85ec788a2b9e9c771f70c39a9f2d2
17d97c863c5c0258b5b9f8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0fd80ac17327a42091762c5c1a502cfdb7e3d0d343c5f066e6b18e1e823374\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:13Z\\\",\\\"message\\\":\\\"\\\\nI0930 17:18:13.159899 6081 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:18:13.159871 6081 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:18:13.159988 6081 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 17:18:13.160043 6081 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:18:13.160090 6081 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:18:13.160169 6081 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:18:13.160293 6081 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:18:13.160739 6081 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:18:13.161130 6081 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.210704 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.210807 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.210874 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.210937 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.210999 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:16Z","lastTransitionTime":"2025-09-30T17:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.223999 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.236893 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.250524 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.261759 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.313882 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.313965 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.313976 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:16 crc 
kubenswrapper[4778]: I0930 17:18:16.313993 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.314003 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:16Z","lastTransitionTime":"2025-09-30T17:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.417274 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.417341 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.417354 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.417377 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.417392 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:16Z","lastTransitionTime":"2025-09-30T17:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.520053 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.520093 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.520102 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.520116 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.520128 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:16Z","lastTransitionTime":"2025-09-30T17:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.623163 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.623238 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.623255 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.623283 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.623301 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:16Z","lastTransitionTime":"2025-09-30T17:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.661527 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-l88vm"] Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.662349 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:16 crc kubenswrapper[4778]: E0930 17:18:16.662457 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.677715 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.692198 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.707327 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.713934 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:16 crc kubenswrapper[4778]: E0930 17:18:16.714142 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.714929 4778 scope.go:117] "RemoveContainer" containerID="a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.723404 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.725537 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.725598 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.725611 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.725670 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.725688 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:16Z","lastTransitionTime":"2025-09-30T17:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.734281 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88lz4\" (UniqueName: \"kubernetes.io/projected/8b0c73d9-9a75-4e65-9220-904133af63fd-kube-api-access-88lz4\") pod \"network-metrics-daemon-l88vm\" (UID: \"8b0c73d9-9a75-4e65-9220-904133af63fd\") " pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.734476 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs\") pod \"network-metrics-daemon-l88vm\" (UID: \"8b0c73d9-9a75-4e65-9220-904133af63fd\") " pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.745104 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.766382 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.785533 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.806610 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.825661 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.828711 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.828777 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.828792 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.828817 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.828832 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:16Z","lastTransitionTime":"2025-09-30T17:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.835920 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs\") pod \"network-metrics-daemon-l88vm\" (UID: \"8b0c73d9-9a75-4e65-9220-904133af63fd\") " pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.835983 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88lz4\" (UniqueName: \"kubernetes.io/projected/8b0c73d9-9a75-4e65-9220-904133af63fd-kube-api-access-88lz4\") pod \"network-metrics-daemon-l88vm\" (UID: \"8b0c73d9-9a75-4e65-9220-904133af63fd\") " pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:16 crc kubenswrapper[4778]: E0930 17:18:16.836139 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:18:16 crc kubenswrapper[4778]: E0930 17:18:16.836264 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs podName:8b0c73d9-9a75-4e65-9220-904133af63fd nodeName:}" failed. No retries permitted until 2025-09-30 17:18:17.336235311 +0000 UTC m=+36.326133144 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs") pod "network-metrics-daemon-l88vm" (UID: "8b0c73d9-9a75-4e65-9220-904133af63fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.850297 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0e
a333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.861454 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88lz4\" (UniqueName: \"kubernetes.io/projected/8b0c73d9-9a75-4e65-9220-904133af63fd-kube-api-access-88lz4\") pod \"network-metrics-daemon-l88vm\" (UID: \"8b0c73d9-9a75-4e65-9220-904133af63fd\") " pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.871610 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.889434 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.905475 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.926069 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1
b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.932089 4778 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.932147 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.932159 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.932178 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.932192 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:16Z","lastTransitionTime":"2025-09-30T17:18:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.948950 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://549497f6dfa85ec788a2b9e9c771f70c39a9f2d2
17d97c863c5c0258b5b9f8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0fd80ac17327a42091762c5c1a502cfdb7e3d0d343c5f066e6b18e1e823374\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:13Z\\\",\\\"message\\\":\\\"\\\\nI0930 17:18:13.159899 6081 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:18:13.159871 6081 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:18:13.159988 6081 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 17:18:13.160043 6081 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:18:13.160090 6081 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:18:13.160169 6081 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:18:13.160293 6081 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:18:13.160739 6081 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:18:13.161130 6081 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.965867 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:16 crc kubenswrapper[4778]: I0930 17:18:16.980173 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:16Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.020602 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.022450 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94"} Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.022989 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.024296 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" event={"ID":"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92","Type":"ContainerStarted","Data":"2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778"} Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.024358 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" event={"ID":"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92","Type":"ContainerStarted","Data":"74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b"} Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.025852 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovnkube-controller/1.log" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.026722 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovnkube-controller/0.log" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.031652 4778 generic.go:334] "Generic (PLEG): container finished" podID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerID="549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3" exitCode=1 Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.031698 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerDied","Data":"549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3"} Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.031737 4778 scope.go:117] "RemoveContainer" 
containerID="0e0fd80ac17327a42091762c5c1a502cfdb7e3d0d343c5f066e6b18e1e823374" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.032435 4778 scope.go:117] "RemoveContainer" containerID="549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3" Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.032637 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.034278 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.034306 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.034318 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.034335 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.034347 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:17Z","lastTransitionTime":"2025-09-30T17:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.052524 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.070693 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.086254 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.107545 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.127811 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.136350 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.136387 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.136397 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.136411 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.136422 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:17Z","lastTransitionTime":"2025-09-30T17:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.163783 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.183745 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025
-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.199871 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.213509 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.227159 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.239214 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.239265 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.239274 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.239293 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.239307 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:17Z","lastTransitionTime":"2025-09-30T17:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.242520 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.261347 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0fd80ac17327a42091762c5c1a502cfdb7e3d0d343c5f066e6b18e1e823374\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:13Z\\\",\\\"message\\\":\\\"\\\\nI0930 17:18:13.159899 6081 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:18:13.159871 6081 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:18:13.159988 6081 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 17:18:13.160043 6081 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:18:13.160090 6081 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:18:13.160169 6081 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:18:13.160293 6081 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:18:13.160739 6081 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:18:13.161130 6081 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.276527 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.290707 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.303245 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.315015 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.324904 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.338915 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.339593 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs\") pod \"network-metrics-daemon-l88vm\" (UID: \"8b0c73d9-9a75-4e65-9220-904133af63fd\") " pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.339819 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.339935 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs podName:8b0c73d9-9a75-4e65-9220-904133af63fd nodeName:}" failed. No retries permitted until 2025-09-30 17:18:18.339908271 +0000 UTC m=+37.329806074 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs") pod "network-metrics-daemon-l88vm" (UID: "8b0c73d9-9a75-4e65-9220-904133af63fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.341305 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.341344 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.341357 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.341375 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.341391 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:17Z","lastTransitionTime":"2025-09-30T17:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.352908 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.370867 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.396059 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://549497f6dfa85ec788a2b9e9c771f70c39a9f2d2
17d97c863c5c0258b5b9f8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0fd80ac17327a42091762c5c1a502cfdb7e3d0d343c5f066e6b18e1e823374\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:13Z\\\",\\\"message\\\":\\\"\\\\nI0930 17:18:13.159899 6081 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 17:18:13.159871 6081 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 17:18:13.159988 6081 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0930 17:18:13.160043 6081 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:18:13.160090 6081 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 17:18:13.160169 6081 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:18:13.160293 6081 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:18:13.160739 6081 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:18:13.161130 6081 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989783 6230 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989793 6230 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0930 17:18:15.989798 6230 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0930 17:18:15.989805 6230 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989156 6230 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 17:18:15.989821 6230 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 17:18:15.989835 6230 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0930 17:18:15.989840 6230 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0930 17:18:15.989844 6230 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0930 
17:18:15.989805 6230 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c62
2f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.408832 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.421526 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.436079 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.443680 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.443767 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.443791 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.443828 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.443852 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:17Z","lastTransitionTime":"2025-09-30T17:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.450730 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.464127 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.482217 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.541535 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.541714 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.541760 4778 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:18:33.54173087 +0000 UTC m=+52.531628683 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.541801 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.541855 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.541883 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.541889 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.541961 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:33.541948147 +0000 UTC m=+52.531845960 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.542012 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.542072 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-09-30 17:18:33.54205853 +0000 UTC m=+52.531956353 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.542178 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.542198 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.542196 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.542265 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.542285 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.542213 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.542365 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:33.542340019 +0000 UTC m=+52.532237822 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.542388 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:18:33.54238078 +0000 UTC m=+52.532278583 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.546770 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.546812 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.546825 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.546845 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.546863 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:17Z","lastTransitionTime":"2025-09-30T17:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.557359 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.572837 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.585797 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.587552 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.587606 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.587636 4778 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.587664 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.587682 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:17Z","lastTransitionTime":"2025-09-30T17:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.601632 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.602174 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 
2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.606320 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.606384 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.606395 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.606418 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.606429 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:17Z","lastTransitionTime":"2025-09-30T17:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.619434 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e
95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.622185 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 
2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.626151 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.626188 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.626198 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.626219 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.626229 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:17Z","lastTransitionTime":"2025-09-30T17:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.634339 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-c
ontroller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.638672 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"kubelet 
has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406ee
c4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\
\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.642349 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.642377 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.642388 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.642404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.642416 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:17Z","lastTransitionTime":"2025-09-30T17:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.653161 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.653803 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"a
f2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.657550 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.657605 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.657629 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.657649 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.657660 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:17Z","lastTransitionTime":"2025-09-30T17:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.670202 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:17Z is after 
2025-08-24T17:21:41Z" Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.670338 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.671773 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.671820 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.671830 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.671851 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.671885 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:17Z","lastTransitionTime":"2025-09-30T17:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.713704 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.713722 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.714036 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:17 crc kubenswrapper[4778]: E0930 17:18:17.714194 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.775223 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.775271 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.775280 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.775296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.775306 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:17Z","lastTransitionTime":"2025-09-30T17:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.879292 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.879713 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.879723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.879741 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.879752 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:17Z","lastTransitionTime":"2025-09-30T17:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.982654 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.982965 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.983027 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.983097 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:17 crc kubenswrapper[4778]: I0930 17:18:17.983166 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:17Z","lastTransitionTime":"2025-09-30T17:18:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.040557 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovnkube-controller/1.log" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.046684 4778 scope.go:117] "RemoveContainer" containerID="549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3" Sep 30 17:18:18 crc kubenswrapper[4778]: E0930 17:18:18.046892 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.068254 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.082201 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.086983 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.087024 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.087035 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.087058 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.087068 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:18Z","lastTransitionTime":"2025-09-30T17:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.100219 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.115443 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.132358 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.151994 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.163439 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.178021 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.189731 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.189772 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.189783 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.189797 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.189808 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:18Z","lastTransitionTime":"2025-09-30T17:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.192484 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.207444 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.243548 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989783 6230 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989793 6230 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0930 17:18:15.989798 6230 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0930 17:18:15.989805 6230 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989156 6230 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 17:18:15.989821 6230 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 17:18:15.989835 6230 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0930 17:18:15.989840 6230 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0930 17:18:15.989844 6230 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0930 17:18:15.989805 6230 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.260767 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.272675 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.290369 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.292815 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.292850 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.292862 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.292880 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.292893 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:18Z","lastTransitionTime":"2025-09-30T17:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.303204 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.318673 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.330902 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:18Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.347681 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs\") pod \"network-metrics-daemon-l88vm\" (UID: \"8b0c73d9-9a75-4e65-9220-904133af63fd\") " pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:18 crc kubenswrapper[4778]: E0930 17:18:18.348015 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:18:18 crc kubenswrapper[4778]: E0930 17:18:18.348079 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs podName:8b0c73d9-9a75-4e65-9220-904133af63fd nodeName:}" failed. No retries permitted until 2025-09-30 17:18:20.348062815 +0000 UTC m=+39.337960618 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs") pod "network-metrics-daemon-l88vm" (UID: "8b0c73d9-9a75-4e65-9220-904133af63fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.396373 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.396442 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.396455 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.396477 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.396493 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:18Z","lastTransitionTime":"2025-09-30T17:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.499467 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.499503 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.499513 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.499530 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.499541 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:18Z","lastTransitionTime":"2025-09-30T17:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.603091 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.603137 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.603147 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.603164 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.603173 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:18Z","lastTransitionTime":"2025-09-30T17:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.706657 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.706722 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.706739 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.706767 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.706784 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:18Z","lastTransitionTime":"2025-09-30T17:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.713127 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.713176 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:18 crc kubenswrapper[4778]: E0930 17:18:18.713304 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:18 crc kubenswrapper[4778]: E0930 17:18:18.713444 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.809304 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.809369 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.809382 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.809400 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.809412 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:18Z","lastTransitionTime":"2025-09-30T17:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.912786 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.912836 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.912844 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.912861 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:18 crc kubenswrapper[4778]: I0930 17:18:18.912872 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:18Z","lastTransitionTime":"2025-09-30T17:18:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.016221 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.016296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.016316 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.016342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.016360 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:19Z","lastTransitionTime":"2025-09-30T17:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.120097 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.120170 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.120195 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.120227 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.120249 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:19Z","lastTransitionTime":"2025-09-30T17:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.222915 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.222982 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.223005 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.223039 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.223060 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:19Z","lastTransitionTime":"2025-09-30T17:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.326511 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.326597 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.326609 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.326694 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.326711 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:19Z","lastTransitionTime":"2025-09-30T17:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.429851 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.429906 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.429916 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.429938 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.429952 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:19Z","lastTransitionTime":"2025-09-30T17:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.533825 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.533876 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.533890 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.533908 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.533921 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:19Z","lastTransitionTime":"2025-09-30T17:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.636337 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.636398 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.636414 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.636440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.636460 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:19Z","lastTransitionTime":"2025-09-30T17:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.713592 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.713716 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:19 crc kubenswrapper[4778]: E0930 17:18:19.713869 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:19 crc kubenswrapper[4778]: E0930 17:18:19.713977 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.739687 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.739744 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.739758 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.739781 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.739803 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:19Z","lastTransitionTime":"2025-09-30T17:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.842692 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.842752 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.842767 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.842787 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.842802 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:19Z","lastTransitionTime":"2025-09-30T17:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.946529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.946585 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.946599 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.946638 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:19 crc kubenswrapper[4778]: I0930 17:18:19.946652 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:19Z","lastTransitionTime":"2025-09-30T17:18:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.049355 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.049425 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.049438 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.049474 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.049493 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:20Z","lastTransitionTime":"2025-09-30T17:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.152948 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.152999 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.153009 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.153026 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.153038 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:20Z","lastTransitionTime":"2025-09-30T17:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.256449 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.256514 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.256523 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.256543 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.256561 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:20Z","lastTransitionTime":"2025-09-30T17:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.359632 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.359691 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.359702 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.359721 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.360061 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:20Z","lastTransitionTime":"2025-09-30T17:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.376359 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs\") pod \"network-metrics-daemon-l88vm\" (UID: \"8b0c73d9-9a75-4e65-9220-904133af63fd\") " pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:20 crc kubenswrapper[4778]: E0930 17:18:20.376561 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:18:20 crc kubenswrapper[4778]: E0930 17:18:20.376668 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs podName:8b0c73d9-9a75-4e65-9220-904133af63fd nodeName:}" failed. No retries permitted until 2025-09-30 17:18:24.376646455 +0000 UTC m=+43.366544258 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs") pod "network-metrics-daemon-l88vm" (UID: "8b0c73d9-9a75-4e65-9220-904133af63fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.464009 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.464077 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.464094 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.464118 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.464135 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:20Z","lastTransitionTime":"2025-09-30T17:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.567120 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.567188 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.567208 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.567237 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.567257 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:20Z","lastTransitionTime":"2025-09-30T17:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.671693 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.671784 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.671795 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.671818 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.671830 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:20Z","lastTransitionTime":"2025-09-30T17:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.713536 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.713679 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:20 crc kubenswrapper[4778]: E0930 17:18:20.713764 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:20 crc kubenswrapper[4778]: E0930 17:18:20.713957 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.775178 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.775235 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.775245 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.775265 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.775278 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:20Z","lastTransitionTime":"2025-09-30T17:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.878983 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.879071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.879094 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.879126 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.879149 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:20Z","lastTransitionTime":"2025-09-30T17:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.982537 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.982610 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.982667 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.982689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:20 crc kubenswrapper[4778]: I0930 17:18:20.982702 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:20Z","lastTransitionTime":"2025-09-30T17:18:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.086175 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.086242 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.086260 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.086287 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.086305 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:21Z","lastTransitionTime":"2025-09-30T17:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.190115 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.190193 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.190220 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.190253 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.190347 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:21Z","lastTransitionTime":"2025-09-30T17:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.293300 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.293381 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.293401 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.293431 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.293452 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:21Z","lastTransitionTime":"2025-09-30T17:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.396171 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.396436 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.396454 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.396482 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.396502 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:21Z","lastTransitionTime":"2025-09-30T17:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.500283 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.500339 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.500353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.500376 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.500390 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:21Z","lastTransitionTime":"2025-09-30T17:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.603018 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.603075 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.603086 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.603108 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.603119 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:21Z","lastTransitionTime":"2025-09-30T17:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.707165 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.707285 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.707310 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.707816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.708106 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:21Z","lastTransitionTime":"2025-09-30T17:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.713613 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.713737 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:21 crc kubenswrapper[4778]: E0930 17:18:21.713868 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:21 crc kubenswrapper[4778]: E0930 17:18:21.714065 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.739961 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.757928 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.776755 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"202
5-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.794678 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.806634 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.810766 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.810869 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.810884 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.810902 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.810915 4778 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:21Z","lastTransitionTime":"2025-09-30T17:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.830090 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.851709 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.869720 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.902485 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.913041 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.913095 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.913105 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.913121 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.913131 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:21Z","lastTransitionTime":"2025-09-30T17:18:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.921392 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.939993 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.957937 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.977637 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-09-30T17:18:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:21 crc kubenswrapper[4778]: I0930 17:18:21.998760 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257
453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989783 6230 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989793 6230 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0930 17:18:15.989798 6230 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0930 17:18:15.989805 6230 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989156 6230 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 17:18:15.989821 6230 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 17:18:15.989835 6230 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0930 17:18:15.989840 6230 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0930 17:18:15.989844 6230 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0930 17:18:15.989805 6230 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=ovnkube-controller pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.016216 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:22Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.016477 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.016562 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.016589 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.016661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.016703 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:22Z","lastTransitionTime":"2025-09-30T17:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.034000 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:22Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.057674 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:22Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.119579 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.119685 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.119700 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.119722 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.119737 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:22Z","lastTransitionTime":"2025-09-30T17:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.223070 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.223142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.223166 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.223196 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.223217 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:22Z","lastTransitionTime":"2025-09-30T17:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.326768 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.326845 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.326859 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.326880 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.326892 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:22Z","lastTransitionTime":"2025-09-30T17:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.430105 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.430170 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.430186 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.430207 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.430220 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:22Z","lastTransitionTime":"2025-09-30T17:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.534166 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.534274 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.534292 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.534507 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.534531 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:22Z","lastTransitionTime":"2025-09-30T17:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.638396 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.638508 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.638521 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.638542 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.638560 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:22Z","lastTransitionTime":"2025-09-30T17:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.714078 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.714228 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:22 crc kubenswrapper[4778]: E0930 17:18:22.714320 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:22 crc kubenswrapper[4778]: E0930 17:18:22.714420 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.741776 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.741863 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.741893 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.741925 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.741979 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:22Z","lastTransitionTime":"2025-09-30T17:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.845789 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.845846 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.845858 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.845879 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.845892 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:22Z","lastTransitionTime":"2025-09-30T17:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.949457 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.949541 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.949602 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.949672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:22 crc kubenswrapper[4778]: I0930 17:18:22.949700 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:22Z","lastTransitionTime":"2025-09-30T17:18:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.053807 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.053873 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.053896 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.053947 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.053976 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:23Z","lastTransitionTime":"2025-09-30T17:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.157189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.157232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.157244 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.157263 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.157276 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:23Z","lastTransitionTime":"2025-09-30T17:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.260910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.261002 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.261020 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.261049 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.261070 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:23Z","lastTransitionTime":"2025-09-30T17:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.365529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.365610 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.365648 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.365675 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.365702 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:23Z","lastTransitionTime":"2025-09-30T17:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.469242 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.469306 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.469319 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.469343 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.469360 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:23Z","lastTransitionTime":"2025-09-30T17:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.572643 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.573057 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.573150 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.573257 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.573364 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:23Z","lastTransitionTime":"2025-09-30T17:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.676504 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.677128 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.677277 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.677417 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.677534 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:23Z","lastTransitionTime":"2025-09-30T17:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.714086 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.714208 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:23 crc kubenswrapper[4778]: E0930 17:18:23.714951 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:23 crc kubenswrapper[4778]: E0930 17:18:23.715062 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.781131 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.781176 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.781185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.781208 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.781220 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:23Z","lastTransitionTime":"2025-09-30T17:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.884364 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.884466 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.884501 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.884532 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.884558 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:23Z","lastTransitionTime":"2025-09-30T17:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.987352 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.987409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.987427 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.987453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:23 crc kubenswrapper[4778]: I0930 17:18:23.987471 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:23Z","lastTransitionTime":"2025-09-30T17:18:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.091104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.091190 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.091212 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.091236 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.091254 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:24Z","lastTransitionTime":"2025-09-30T17:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.193985 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.194078 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.194105 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.194142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.194165 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:24Z","lastTransitionTime":"2025-09-30T17:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.297369 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.297443 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.297466 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.297497 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.297514 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:24Z","lastTransitionTime":"2025-09-30T17:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.400529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.400580 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.400592 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.400633 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.400647 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:24Z","lastTransitionTime":"2025-09-30T17:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.423137 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs\") pod \"network-metrics-daemon-l88vm\" (UID: \"8b0c73d9-9a75-4e65-9220-904133af63fd\") " pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:24 crc kubenswrapper[4778]: E0930 17:18:24.423336 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:18:24 crc kubenswrapper[4778]: E0930 17:18:24.423419 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs podName:8b0c73d9-9a75-4e65-9220-904133af63fd nodeName:}" failed. No retries permitted until 2025-09-30 17:18:32.423395134 +0000 UTC m=+51.413292977 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs") pod "network-metrics-daemon-l88vm" (UID: "8b0c73d9-9a75-4e65-9220-904133af63fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.503317 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.503382 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.503399 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.503420 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.503432 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:24Z","lastTransitionTime":"2025-09-30T17:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.606751 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.606813 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.606824 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.606843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.606855 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:24Z","lastTransitionTime":"2025-09-30T17:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.710375 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.710437 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.710456 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.710481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.710497 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:24Z","lastTransitionTime":"2025-09-30T17:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.713853 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.713853 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:24 crc kubenswrapper[4778]: E0930 17:18:24.714065 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:24 crc kubenswrapper[4778]: E0930 17:18:24.714174 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
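[annotation] Every NodeNotReady sweep above carries the same cause: "no CNI configuration file in /etc/kubernetes/cni/net.d/". A rough sketch of that readiness test, assuming the runtime simply scans the conf directory for a usable network config (the suffix set here is an illustrative assumption):

```python
import pathlib

# Stay NetworkReady=false while the CNI conf dir holds no usable config file.
CNI_CONF_DIR = pathlib.Path("/etc/kubernetes/cni/net.d")

def cni_config_present(conf_dir: pathlib.Path = CNI_CONF_DIR) -> bool:
    if not conf_dir.is_dir():
        return False
    return any(p.suffix in {".conf", ".conflist", ".json"} for p in conf_dir.iterdir())

print("NetworkReady" if cni_config_present()
      else "NetworkReady=false: no CNI configuration file found")
```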
pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.813761 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.813813 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.813829 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.813850 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.813863 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:24Z","lastTransitionTime":"2025-09-30T17:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.917157 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.917222 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.917236 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.917258 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:24 crc kubenswrapper[4778]: I0930 17:18:24.917272 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:24Z","lastTransitionTime":"2025-09-30T17:18:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.021017 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.021078 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.021096 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.021125 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.021145 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:25Z","lastTransitionTime":"2025-09-30T17:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.125062 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.125130 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.125151 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.125181 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.125204 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:25Z","lastTransitionTime":"2025-09-30T17:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.229055 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.229132 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.229158 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.229206 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.229233 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:25Z","lastTransitionTime":"2025-09-30T17:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.332073 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.332166 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.332193 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.332227 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.332247 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:25Z","lastTransitionTime":"2025-09-30T17:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.435660 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.435721 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.435739 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.435768 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.435788 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:25Z","lastTransitionTime":"2025-09-30T17:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.539685 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.539745 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.539763 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.539787 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.539807 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:25Z","lastTransitionTime":"2025-09-30T17:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.643950 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.644024 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.644048 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.644082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.644109 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:25Z","lastTransitionTime":"2025-09-30T17:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.713999 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.714121 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:25 crc kubenswrapper[4778]: E0930 17:18:25.714326 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:25 crc kubenswrapper[4778]: E0930 17:18:25.714517 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
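[annotation] The pod_workers records around this point skip the same handful of pods over and over while the network is down. A small helper for reading this journal, assuming each record sits on a single line with the err= and pod= fields in the order shown:

```python
import re
from collections import Counter

# Tally how often each pod is skipped with "Error syncing pod".
SKIP_RE = re.compile(r'"Error syncing pod, skipping".*pod="(?P<pod>[^"]+)"')

def skipped_pods(lines) -> Counter:
    counts = Counter()
    for line in lines:
        match = SKIP_RE.search(line)
        if match:
            counts[match.group("pod")] += 1
    return counts

# Example: skipped_pods(open("kubelet.log")) might yield
# Counter({"openshift-multus/network-metrics-daemon-l88vm": 2, ...})
```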
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.752549 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.752638 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.752659 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.752688 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.752711 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:25Z","lastTransitionTime":"2025-09-30T17:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.856259 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.856409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.856437 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.856471 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.856495 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:25Z","lastTransitionTime":"2025-09-30T17:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.959790 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.959875 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.959896 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.959926 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:25 crc kubenswrapper[4778]: I0930 17:18:25.959946 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:25Z","lastTransitionTime":"2025-09-30T17:18:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.063864 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.063908 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.063923 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.063947 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.063965 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:26Z","lastTransitionTime":"2025-09-30T17:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.167532 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.167600 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.167659 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.167692 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.167716 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:26Z","lastTransitionTime":"2025-09-30T17:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.271134 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.271219 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.271240 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.271268 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.271322 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:26Z","lastTransitionTime":"2025-09-30T17:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.374471 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.374539 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.374558 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.374585 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.374603 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:26Z","lastTransitionTime":"2025-09-30T17:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.477199 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.477259 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.477274 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.477296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.477312 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:26Z","lastTransitionTime":"2025-09-30T17:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.581013 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.581110 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.581131 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.581155 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.581174 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:26Z","lastTransitionTime":"2025-09-30T17:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.685729 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.685815 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.685833 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.685866 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.685887 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:26Z","lastTransitionTime":"2025-09-30T17:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.713442 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.713574 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:26 crc kubenswrapper[4778]: E0930 17:18:26.713825 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
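[annotation] The "Node became not ready" sweeps above recur about every 100 ms. A quick way to confirm that cadence from two klog timestamps; the date-free "%H:%M:%S.%f" parse is a simplification that only works within a single day of logs:

```python
from datetime import datetime

def klog_time(stamp: str) -> datetime:
    # klog prints e.g. "I0930 17:18:26.581174"; we parse only the time part.
    return datetime.strptime(stamp, "%H:%M:%S.%f")

t1 = klog_time("17:18:26.581174")
t2 = klog_time("17:18:26.685887")
print((t2 - t1).total_seconds())  # ~0.105 s between successive status sweeps
```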
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:26 crc kubenswrapper[4778]: E0930 17:18:26.713969 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.789809 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.789870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.789884 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.789905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.789923 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:26Z","lastTransitionTime":"2025-09-30T17:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.892950 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.893011 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.893025 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.893047 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.893063 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:26Z","lastTransitionTime":"2025-09-30T17:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.996567 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.996634 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.996645 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.996665 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:26 crc kubenswrapper[4778]: I0930 17:18:26.996678 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:26Z","lastTransitionTime":"2025-09-30T17:18:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.100105 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.100170 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.100184 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.100206 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.100221 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:27Z","lastTransitionTime":"2025-09-30T17:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.202508 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.202579 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.202592 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.202611 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.202637 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:27Z","lastTransitionTime":"2025-09-30T17:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.305761 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.305837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.305848 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.305868 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.305879 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:27Z","lastTransitionTime":"2025-09-30T17:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.409138 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.409186 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.409195 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.409213 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.409225 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:27Z","lastTransitionTime":"2025-09-30T17:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.511885 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.511949 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.511963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.511985 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.511999 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:27Z","lastTransitionTime":"2025-09-30T17:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.614588 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.614652 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.614661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.614679 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.614689 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:27Z","lastTransitionTime":"2025-09-30T17:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.713539 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.713707 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:27 crc kubenswrapper[4778]: E0930 17:18:27.713766 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:27 crc kubenswrapper[4778]: E0930 17:18:27.714025 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
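[annotation] Each setters.go record embeds the new Ready condition as literal JSON, which is easier to inspect once extracted. A hedged extractor; it assumes one record per line and no nested braces in the payload, which holds for the lines above:

```python
import json
import re

# Pull the condition={...} payload out of a setters.go record.
COND_RE = re.compile(r"condition=(\{[^{}]*\})")

def ready_condition(record: str):
    match = COND_RE.search(record)
    return json.loads(match.group(1)) if match else None

cond = ready_condition('... setters.go:603] "Node became not ready" node="crc" '
                       'condition={"type":"Ready","status":"False",'
                       '"reason":"KubeletNotReady","message":"..."}')
print(cond["reason"])  # KubeletNotReady
```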
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.717682 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.717729 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.717742 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.717755 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.717766 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:27Z","lastTransitionTime":"2025-09-30T17:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.811078 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.811168 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.811192 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.811223 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.811246 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:27Z","lastTransitionTime":"2025-09-30T17:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:27 crc kubenswrapper[4778]: E0930 17:18:27.831668 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:27Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.837374 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.837478 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
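[annotation] The status patch above is rejected because the node-identity webhook's serving certificate expired on 2025-08-24, well before the current time of 2025-09-30. A hedged spot check using the third-party "cryptography" package: fetch the certificate without verification (verification is exactly the step that fails) and compare its notAfter to now. The host and port come from the error text; everything else is an assumption for illustration:

```python
import datetime
import ssl
from cryptography import x509  # third-party "cryptography" package

def cert_expired(host: str = "127.0.0.1", port: int = 9743) -> bool:
    # No ca_certs given, so the fetch itself skips verification.
    pem = ssl.get_server_certificate((host, port))
    cert = x509.load_pem_x509_certificate(pem.encode())
    return cert.not_valid_after < datetime.datetime.utcnow()  # naive-UTC API

if cert_expired():
    print("webhook serving certificate has expired")
```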
event="NodeHasNoDiskPressure" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.837503 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.837575 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.837604 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:27Z","lastTransitionTime":"2025-09-30T17:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:27 crc kubenswrapper[4778]: E0930 17:18:27.859115 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:27Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.865067 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.865142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.865154 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.865172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.865184 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:27Z","lastTransitionTime":"2025-09-30T17:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:27 crc kubenswrapper[4778]: E0930 17:18:27.888392 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:27Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.893135 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.893181 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.893195 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.893217 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.893233 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:27Z","lastTransitionTime":"2025-09-30T17:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:27 crc kubenswrapper[4778]: E0930 17:18:27.909469 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:27Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.913940 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.913976 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.913985 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.914001 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.914013 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:27Z","lastTransitionTime":"2025-09-30T17:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:27 crc kubenswrapper[4778]: E0930 17:18:27.928865 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:27Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:27 crc kubenswrapper[4778]: E0930 17:18:27.928994 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.930498 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
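Every status-update attempt above fails identically: the kubelet's PATCH is rejected because the validating webhook "node.network-node-identity.openshift.io" at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, about five weeks before the logged clock time, and once the retry budget is exhausted the kubelet gives up with "update node status exceeds retry count". A minimal Go sketch for confirming the expiry from the node itself; the address comes from the log line, and running an ad hoc probe like this with a local Go toolchain is an assumption, not part of any cluster tooling:

```go
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

func main() {
	// 127.0.0.1:9743 is the webhook endpoint from the kubelet log.
	// InsecureSkipVerify is deliberate here: verification is exactly what
	// fails, and we want the handshake to finish so the peer certificate
	// can be inspected.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		fmt.Println("dial:", err)
		return
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Println("subject:  ", cert.Subject)
	fmt.Println("notBefore:", cert.NotBefore.Format(time.RFC3339))
	fmt.Println("notAfter: ", cert.NotAfter.Format(time.RFC3339))
	if time.Now().After(cert.NotAfter) {
		fmt.Println("certificate expired", time.Since(cert.NotAfter).Round(time.Hour), "ago")
	}
}
```

Against the timestamps in the log this would report roughly 37 days of expiry (2025-08-24T17:21:41Z to 2025-09-30T17:18:27Z), matching the x509 message on every retry.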
event="NodeHasSufficientMemory" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.930530 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.930539 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.930552 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:27 crc kubenswrapper[4778]: I0930 17:18:27.930563 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:27Z","lastTransitionTime":"2025-09-30T17:18:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.034349 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.034412 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.034428 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.034450 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.034471 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:28Z","lastTransitionTime":"2025-09-30T17:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.137597 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.137682 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.137695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.137719 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.137735 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:28Z","lastTransitionTime":"2025-09-30T17:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.240870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.240966 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.240987 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.241013 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.241032 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:28Z","lastTransitionTime":"2025-09-30T17:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.345548 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.345655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.345670 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.345694 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.345705 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:28Z","lastTransitionTime":"2025-09-30T17:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.448723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.448779 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.448791 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.448815 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.448828 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:28Z","lastTransitionTime":"2025-09-30T17:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
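Independently of the webhook failure, every "Node became not ready" condition above carries the same reason: the kubelet finds no CNI configuration file in /etc/kubernetes/cni/net.d/, so NetworkReady stays false and the node is held at Ready=False. A small sketch of the same check, assuming (as libcni conventionally does) that network configs end in .conf, .conflist, or .json; the directory is taken verbatim from the log message:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory from the kubelet error message.
	const dir = "/etc/kubernetes/cni/net.d"

	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read", dir+":", err)
		return
	}
	found := 0
	for _, e := range entries {
		// Extensions libcni conventionally accepts for network configs.
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("CNI config:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		// Mirrors the NetworkPluginNotReady condition in the log.
		fmt.Println("no CNI configuration files; network plugin not ready")
	}
}
```

On this node the directory is presumably empty until the network operator, whose own status updates are failing in the records below, manages to write a config.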
Has your network provider started?"} Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.551851 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.551925 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.551949 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.551977 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.551996 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:28Z","lastTransitionTime":"2025-09-30T17:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.626132 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.649562 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.654755 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.654816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.654846 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.654880 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.654907 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:28Z","lastTransitionTime":"2025-09-30T17:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.670552 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.693951 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.713602 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.713794 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:28 crc kubenswrapper[4778]: E0930 17:18:28.713839 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:28 crc kubenswrapper[4778]: E0930 17:18:28.714078 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
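
Every "network is not ready" entry above traces back to one condition: the kubelet reports NetworkReady=false because /etc/kubernetes/cni/net.d/ contains no CNI configuration file, so pods that need a sandbox network are skipped. The following is a minimal Go sketch of the readiness test the message describes, not the kubelet's actual implementation; the directory path is simply the one named in the log, and the extensions are the ones libcni scans for.

package main

import (
    "fmt"
    "os"
    "path/filepath"
)

func main() {
    // Directory named in the log messages above.
    confDir := "/etc/kubernetes/cni/net.d"
    entries, err := os.ReadDir(confDir)
    if err != nil {
        fmt.Printf("NetworkReady=false: cannot read %s: %v\n", confDir, err)
        os.Exit(1)
    }
    for _, e := range entries {
        // The network plugin counts as ready once at least one config exists.
        switch filepath.Ext(e.Name()) {
        case ".conf", ".conflist", ".json":
            fmt.Printf("NetworkReady=true: found %s\n", filepath.Join(confDir, e.Name()))
            return
        }
    }
    fmt.Println("NetworkReady=false: no CNI configuration file found")
    os.Exit(1)
}

On this node the check keeps failing because the component that would write that config, the ovnkube-controller container, is itself crash-looping, as the CrashLoopBackOff entry further below shows.
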
pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.719928 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257
453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989783 6230 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989793 6230 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0930 17:18:15.989798 6230 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0930 17:18:15.989805 6230 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989156 6230 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 17:18:15.989821 6230 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 17:18:15.989835 6230 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0930 17:18:15.989840 6230 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0930 17:18:15.989844 6230 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0930 17:18:15.989805 6230 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=ovnkube-controller pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.738238 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.758987 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.759043 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.759062 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.759088 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.759108 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:28Z","lastTransitionTime":"2025-09-30T17:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.761770 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.777465 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status 
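
Every status patch in this stretch is rejected for the same reason: the pod.network-node-identity.openshift.io webhook at 127.0.0.1:9743 serves a certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2025-09-30. A small Go diagnostic sketch follows that dials a TLS endpoint and prints each certificate's validity window; the address is taken from the failing Post calls above, everything else is illustrative, and it would need to run on the node itself since the endpoint is loopback.

package main

import (
    "crypto/tls"
    "fmt"
    "os"
    "time"
)

func main() {
    // Webhook endpoint from the log entries above.
    addr := "127.0.0.1:9743"
    // Verification is skipped on purpose so an already-expired
    // chain can still be retrieved and inspected.
    conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
    if err != nil {
        fmt.Fprintf(os.Stderr, "dial %s: %v\n", addr, err)
        os.Exit(1)
    }
    defer conn.Close()
    now := time.Now()
    for _, cert := range conn.ConnectionState().PeerCertificates {
        fmt.Printf("subject=%q notBefore=%s notAfter=%s expired=%t\n",
            cert.Subject.CommonName,
            cert.NotBefore.Format(time.RFC3339),
            cert.NotAfter.Format(time.RFC3339),
            now.After(cert.NotAfter))
    }
}
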
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.798282 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.814038 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.828045 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.858745 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.861989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.862055 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.862077 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.862104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.862121 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:28Z","lastTransitionTime":"2025-09-30T17:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.880012 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.903433 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.921478 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.943585 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.965474 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.965548 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.965566 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.965599 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.965634 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:28Z","lastTransitionTime":"2025-09-30T17:18:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.967374 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:28 crc kubenswrapper[4778]: I0930 17:18:28.989179 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:28Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.068397 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.068468 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.068481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.068502 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.068518 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:29Z","lastTransitionTime":"2025-09-30T17:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.171352 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.171443 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.171463 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.171490 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.171506 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:29Z","lastTransitionTime":"2025-09-30T17:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.274535 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.274593 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.274608 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.274650 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.274666 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:29Z","lastTransitionTime":"2025-09-30T17:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.377905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.377956 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.377968 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.377987 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.378001 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:29Z","lastTransitionTime":"2025-09-30T17:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.481836 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.481917 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.481941 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.481973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.481996 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:29Z","lastTransitionTime":"2025-09-30T17:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.585476 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.585529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.585541 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.585561 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.585575 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:29Z","lastTransitionTime":"2025-09-30T17:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.688873 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.688964 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.688987 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.689019 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.689046 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:29Z","lastTransitionTime":"2025-09-30T17:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.713298 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.713327 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:29 crc kubenswrapper[4778]: E0930 17:18:29.713493 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:29 crc kubenswrapper[4778]: E0930 17:18:29.713808 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.793032 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.793111 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.793129 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.793156 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.793180 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:29Z","lastTransitionTime":"2025-09-30T17:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.896090 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.896142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.896152 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.896173 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.896186 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:29Z","lastTransitionTime":"2025-09-30T17:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.999783 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.999845 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.999857 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.999875 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:29 crc kubenswrapper[4778]: I0930 17:18:29.999889 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:29Z","lastTransitionTime":"2025-09-30T17:18:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.103297 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.103353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.103371 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.103401 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.103421 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:30Z","lastTransitionTime":"2025-09-30T17:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.206720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.206784 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.206805 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.206831 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.206851 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:30Z","lastTransitionTime":"2025-09-30T17:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.310899 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.310962 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.310980 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.311008 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.311023 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:30Z","lastTransitionTime":"2025-09-30T17:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.414484 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.414541 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.414552 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.414570 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.414580 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:30Z","lastTransitionTime":"2025-09-30T17:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.518044 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.518146 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.518179 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.518211 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.518232 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:30Z","lastTransitionTime":"2025-09-30T17:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.621398 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.621436 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.621445 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.621461 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.621476 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:30Z","lastTransitionTime":"2025-09-30T17:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.714026 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:30 crc kubenswrapper[4778]: E0930 17:18:30.714185 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.714053 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:30 crc kubenswrapper[4778]: E0930 17:18:30.714365 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.715103 4778 scope.go:117] "RemoveContainer" containerID="549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.724310 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.724354 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.724365 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.724379 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.724389 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:30Z","lastTransitionTime":"2025-09-30T17:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.828261 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.828580 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.828590 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.828613 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.828649 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:30Z","lastTransitionTime":"2025-09-30T17:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.931466 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.931521 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.931535 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.931559 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:30 crc kubenswrapper[4778]: I0930 17:18:30.931572 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:30Z","lastTransitionTime":"2025-09-30T17:18:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.034258 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.034347 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.034372 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.034409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.034434 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:31Z","lastTransitionTime":"2025-09-30T17:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.098304 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovnkube-controller/1.log" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.102472 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerStarted","Data":"db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f"} Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.102942 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.120528 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.137559 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.137670 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.137726 4778 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.137755 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.137775 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:31Z","lastTransitionTime":"2025-09-30T17:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.143558 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.158273 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.185671 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMo
unts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.216028 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c32159
36229c0fb99ac4d01d14af7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989783 6230 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989793 6230 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0930 17:18:15.989798 6230 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0930 17:18:15.989805 6230 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989156 6230 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 17:18:15.989821 6230 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 17:18:15.989835 6230 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0930 17:18:15.989840 6230 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0930 17:18:15.989844 6230 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0930 17:18:15.989805 6230 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.235266 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.239848 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.239911 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.239925 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.239943 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.239953 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:31Z","lastTransitionTime":"2025-09-30T17:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.247797 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.260811 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/et
c/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.270603 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.281377 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.1
68.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.292631 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.306484 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.318142 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.332774 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.343136 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.343185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.343197 4778 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.343214 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.343226 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:31Z","lastTransitionTime":"2025-09-30T17:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.347637 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.362541 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.381497 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.446408 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.446448 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.446458 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.446475 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.446489 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:31Z","lastTransitionTime":"2025-09-30T17:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.548577 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.548646 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.548662 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.548684 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.548701 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:31Z","lastTransitionTime":"2025-09-30T17:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.651720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.651770 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.651781 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.651803 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.651821 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:31Z","lastTransitionTime":"2025-09-30T17:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.713742 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.713805 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:31 crc kubenswrapper[4778]: E0930 17:18:31.713951 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:31 crc kubenswrapper[4778]: E0930 17:18:31.714148 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.734150 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.752214 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.754491 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.754542 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.754561 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.754592 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.754614 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:31Z","lastTransitionTime":"2025-09-30T17:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.774610 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.795260 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989783 6230 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989793 6230 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0930 17:18:15.989798 6230 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0930 17:18:15.989805 6230 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989156 6230 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 17:18:15.989821 6230 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 17:18:15.989835 6230 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0930 17:18:15.989840 6230 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0930 17:18:15.989844 6230 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0930 17:18:15.989805 6230 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.806530 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.828373 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.841594 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.853165 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 
17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.857195 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.857234 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.857246 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.857267 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.857282 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:31Z","lastTransitionTime":"2025-09-30T17:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.866995 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.896150 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0e
a333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.912385 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548d
eadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.928195 4778 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.945744 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.959947 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.960158 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.960218 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.960232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.960252 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.960266 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:31Z","lastTransitionTime":"2025-09-30T17:18:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.971889 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.984279 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025
-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:31 crc kubenswrapper[4778]: I0930 17:18:31.996163 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.063003 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.063099 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.063118 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.063147 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.063170 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:32Z","lastTransitionTime":"2025-09-30T17:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.109539 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovnkube-controller/2.log" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.110648 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovnkube-controller/1.log" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.115408 4778 generic.go:334] "Generic (PLEG): container finished" podID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerID="db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f" exitCode=1 Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.115485 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerDied","Data":"db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f"} Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.115570 4778 scope.go:117] "RemoveContainer" containerID="549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.116787 4778 scope.go:117] "RemoveContainer" containerID="db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f" Sep 30 17:18:32 crc kubenswrapper[4778]: E0930 17:18:32.117159 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.156235 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0e
a333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:32Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.166220 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.166279 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.166290 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.166310 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.166322 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:32Z","lastTransitionTime":"2025-09-30T17:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.178907 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:32Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.194237 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:32Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.215272 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:32Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.236491 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:32Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.249836 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:32Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.264776 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e
35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:32Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.268776 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.268868 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.268910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.268939 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.268972 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:32Z","lastTransitionTime":"2025-09-30T17:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.279805 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:32Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.295969 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:32Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.310826 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:32Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.328238 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:32Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.349746 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c32159
36229c0fb99ac4d01d14af7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://549497f6dfa85ec788a2b9e9c771f70c39a9f2d217d97c863c5c0258b5b9f8e3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989783 6230 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989793 6230 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0930 17:18:15.989798 6230 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0930 17:18:15.989805 6230 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0930 17:18:15.989156 6230 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 17:18:15.989821 6230 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 17:18:15.989835 6230 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0930 17:18:15.989840 6230 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0930 17:18:15.989844 6230 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nF0930 17:18:15.989805 6230 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:31Z\\\",\\\"message\\\":\\\"time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z]\\\\nI0930 17:18:31.613135 6451 services_controller.go:444] Built service default/kubernetes LB per-node configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.1\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:6443, V4IPs:[]string{\\\\\\\"192.168.126.11\\\\\\\"}, V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0930 17:18:31.613156 6451 services_controller.go:445] Built service default/kubernetes LB template configs for network=default: []services.lbConfig(nil)\\\\nI0930 17:18:31.613139 6451 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-ac
cess-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:32Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.362699 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:32Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.371712 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.371771 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.371784 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.371816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.371836 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:32Z","lastTransitionTime":"2025-09-30T17:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.378181 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:32Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.395938 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:32Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.416190 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:32Z is after 2025-08-24T17:21:41Z" Sep 30 
17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.435082 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:32Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.474315 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.474365 4778 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.474374 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.474390 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.474402 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:32Z","lastTransitionTime":"2025-09-30T17:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.520183 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs\") pod \"network-metrics-daemon-l88vm\" (UID: \"8b0c73d9-9a75-4e65-9220-904133af63fd\") " pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:32 crc kubenswrapper[4778]: E0930 17:18:32.520419 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:18:32 crc kubenswrapper[4778]: E0930 17:18:32.520567 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs podName:8b0c73d9-9a75-4e65-9220-904133af63fd nodeName:}" failed. No retries permitted until 2025-09-30 17:18:48.520527993 +0000 UTC m=+67.510425836 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs") pod "network-metrics-daemon-l88vm" (UID: "8b0c73d9-9a75-4e65-9220-904133af63fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.576939 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.576995 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.577004 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.577024 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.577035 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:32Z","lastTransitionTime":"2025-09-30T17:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.680118 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.680166 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.680175 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.680189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.680199 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:32Z","lastTransitionTime":"2025-09-30T17:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.713864 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.714007 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:32 crc kubenswrapper[4778]: E0930 17:18:32.714070 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:32 crc kubenswrapper[4778]: E0930 17:18:32.714185 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.783521 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.783593 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.783641 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.783671 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.783700 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:32Z","lastTransitionTime":"2025-09-30T17:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.887131 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.887223 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.887242 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.887275 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.887297 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:32Z","lastTransitionTime":"2025-09-30T17:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.990561 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.990663 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.990681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.990709 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:32 crc kubenswrapper[4778]: I0930 17:18:32.990728 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:32Z","lastTransitionTime":"2025-09-30T17:18:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.093946 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.094005 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.094018 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.094036 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.094049 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:33Z","lastTransitionTime":"2025-09-30T17:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.122334 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovnkube-controller/2.log" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.127423 4778 scope.go:117] "RemoveContainer" containerID="db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f" Sep 30 17:18:33 crc kubenswrapper[4778]: E0930 17:18:33.127733 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.142776 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:33Z is after 
2025-08-24T17:21:41Z" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.166441 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"pod
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:33Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.187171 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:33Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.198205 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.198270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:33 
crc kubenswrapper[4778]: I0930 17:18:33.198288 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.198318 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.198371 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:33Z","lastTransitionTime":"2025-09-30T17:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.208022 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T
17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:33Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.226684 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:33Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.247166 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:33Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.268130 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:33Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.286295 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:33Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.302192 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.302253 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.302271 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.302296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.302317 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:33Z","lastTransitionTime":"2025-09-30T17:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.308066 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:33Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.332976 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:33Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.354493 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:33Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.372765 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:33Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.389584 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-09-30T17:18:33Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.404909 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.404967 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.404988 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.405010 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.405022 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:33Z","lastTransitionTime":"2025-09-30T17:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.413490 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":
\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e8
8ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:33Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.442324 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c32159
36229c0fb99ac4d01d14af7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:31Z\\\",\\\"message\\\":\\\"time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z]\\\\nI0930 17:18:31.613135 6451 services_controller.go:444] Built service default/kubernetes LB per-node configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.1\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:6443, V4IPs:[]string{\\\\\\\"192.168.126.11\\\\\\\"}, V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0930 17:18:31.613156 6451 services_controller.go:445] Built service default/kubernetes LB template configs for network=default: []services.lbConfig(nil)\\\\nI0930 17:18:31.613139 6451 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:33Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.466187 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:33Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.485204 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:33Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.507940 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.508005 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.508020 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.508045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.508060 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:33Z","lastTransitionTime":"2025-09-30T17:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.611197 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.611291 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.611324 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.611357 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.611381 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:33Z","lastTransitionTime":"2025-09-30T17:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.634191 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.634369 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:33 crc kubenswrapper[4778]: E0930 17:18:33.634490 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:19:05.634452921 +0000 UTC m=+84.624350764 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:18:33 crc kubenswrapper[4778]: E0930 17:18:33.634600 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:18:33 crc kubenswrapper[4778]: E0930 17:18:33.634684 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:18:33 crc kubenswrapper[4778]: E0930 17:18:33.634712 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:33 crc kubenswrapper[4778]: E0930 17:18:33.634832 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:19:05.634777891 +0000 UTC m=+84.624675744 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.634826 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.634947 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.635025 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:33 crc kubenswrapper[4778]: E0930 17:18:33.635092 4778 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:18:33 crc kubenswrapper[4778]: E0930 17:18:33.635178 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:19:05.635154123 +0000 UTC m=+84.625051986 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:18:33 crc kubenswrapper[4778]: E0930 17:18:33.635213 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:18:33 crc kubenswrapper[4778]: E0930 17:18:33.635325 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:19:05.635296788 +0000 UTC m=+84.625194631 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:18:33 crc kubenswrapper[4778]: E0930 17:18:33.635340 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:18:33 crc kubenswrapper[4778]: E0930 17:18:33.635375 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:18:33 crc kubenswrapper[4778]: E0930 17:18:33.635398 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:33 crc kubenswrapper[4778]: E0930 17:18:33.635460 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:19:05.635439873 +0000 UTC m=+84.625337786 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.713229 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.713347 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:33 crc kubenswrapper[4778]: E0930 17:18:33.713572 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:33 crc kubenswrapper[4778]: E0930 17:18:33.719047 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.719529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.719570 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.719600 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.719654 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.719678 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:33Z","lastTransitionTime":"2025-09-30T17:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
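Has your network provider started?"}

The condition={...} payload these setters.go records carry is ordinary JSON, so it can be pulled apart mechanically when working through a long journal. A minimal Go sketch, using the 17:18:33.719678 payload above verbatim; the struct lists only the fields this log actually shows:

package main

import (
	"encoding/json"
	"fmt"
)

// NodeCondition mirrors the fields visible in the "Node became not ready"
// records; nothing beyond what the log shows is assumed.
type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Condition payload copied verbatim from the 17:18:33.719678 record.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:33Z","lastTransitionTime":"2025-09-30T17:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c NodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	fmt.Printf("%s=%s reason=%s\nmessage: %s\n", c.Type, c.Status, c.Reason, c.Message)
}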
Has your network provider started?"} Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.823346 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.823409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.823426 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.823448 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.823462 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:33Z","lastTransitionTime":"2025-09-30T17:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.926417 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.926485 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.926504 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.926535 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:33 crc kubenswrapper[4778]: I0930 17:18:33.926559 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:33Z","lastTransitionTime":"2025-09-30T17:18:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.030166 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.030246 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.030269 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.030300 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.030322 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:34Z","lastTransitionTime":"2025-09-30T17:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.132602 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.132735 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.132753 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.132780 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.132800 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:34Z","lastTransitionTime":"2025-09-30T17:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.235318 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.235366 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.235376 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.235395 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.235410 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:34Z","lastTransitionTime":"2025-09-30T17:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.338601 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.338681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.338692 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.338711 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.338724 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:34Z","lastTransitionTime":"2025-09-30T17:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.441254 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.441326 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.441348 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.441378 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.441400 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:34Z","lastTransitionTime":"2025-09-30T17:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.545401 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.545468 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.545491 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.545521 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.545543 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:34Z","lastTransitionTime":"2025-09-30T17:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.648951 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.649024 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.649048 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.649084 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.649112 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:34Z","lastTransitionTime":"2025-09-30T17:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.713703 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.713770 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:34 crc kubenswrapper[4778]: E0930 17:18:34.713963 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:34 crc kubenswrapper[4778]: E0930 17:18:34.714075 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.752575 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.752682 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.752701 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.752732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.752752 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:34Z","lastTransitionTime":"2025-09-30T17:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.855120 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.855185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.855200 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.855217 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.855229 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:34Z","lastTransitionTime":"2025-09-30T17:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.959188 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.959262 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.959280 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.959306 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:34 crc kubenswrapper[4778]: I0930 17:18:34.959324 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:34Z","lastTransitionTime":"2025-09-30T17:18:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.062505 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.062564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.062583 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.062611 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.062667 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:35Z","lastTransitionTime":"2025-09-30T17:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.165495 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.165772 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.165839 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.165945 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.166020 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:35Z","lastTransitionTime":"2025-09-30T17:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.270105 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.270428 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.270513 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.270588 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.270666 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:35Z","lastTransitionTime":"2025-09-30T17:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.373843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.374177 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.374356 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.374424 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.374510 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:35Z","lastTransitionTime":"2025-09-30T17:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.476841 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.476893 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.476909 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.476931 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.476946 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:35Z","lastTransitionTime":"2025-09-30T17:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.579941 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.579995 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.580005 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.580022 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.580033 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:35Z","lastTransitionTime":"2025-09-30T17:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.683856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.684533 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.684601 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.684698 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.684790 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:35Z","lastTransitionTime":"2025-09-30T17:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.713299 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.713423 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:35 crc kubenswrapper[4778]: E0930 17:18:35.713661 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:35 crc kubenswrapper[4778]: E0930 17:18:35.713941 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.787769 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.787869 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.787883 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.787908 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.787928 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:35Z","lastTransitionTime":"2025-09-30T17:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.891066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.891140 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.891164 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.891199 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.891220 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:35Z","lastTransitionTime":"2025-09-30T17:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.994309 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.994415 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.994440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.994476 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:35 crc kubenswrapper[4778]: I0930 17:18:35.994500 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:35Z","lastTransitionTime":"2025-09-30T17:18:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.097685 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.098156 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.098309 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.098475 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.098664 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:36Z","lastTransitionTime":"2025-09-30T17:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.201651 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.201717 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.201734 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.201754 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.201768 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:36Z","lastTransitionTime":"2025-09-30T17:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.305427 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.306144 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.306307 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.306462 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.306594 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:36Z","lastTransitionTime":"2025-09-30T17:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.410244 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.410593 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.410677 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.410747 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.410803 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:36Z","lastTransitionTime":"2025-09-30T17:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.513469 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.513816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.513872 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.513896 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.513910 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:36Z","lastTransitionTime":"2025-09-30T17:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.616837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.616903 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.616918 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.616939 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.616954 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:36Z","lastTransitionTime":"2025-09-30T17:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.713517 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:36 crc kubenswrapper[4778]: E0930 17:18:36.713846 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.714181 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:36 crc kubenswrapper[4778]: E0930 17:18:36.714326 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.720239 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.720349 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.720368 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.720398 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.720420 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:36Z","lastTransitionTime":"2025-09-30T17:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.824392 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.824446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.824460 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.824478 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.824491 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:36Z","lastTransitionTime":"2025-09-30T17:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.927785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.927850 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.927874 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.927906 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:36 crc kubenswrapper[4778]: I0930 17:18:36.927932 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:36Z","lastTransitionTime":"2025-09-30T17:18:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.030884 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.030967 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.030980 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.031001 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.031016 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:37Z","lastTransitionTime":"2025-09-30T17:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.134132 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.134642 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.134783 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.134963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.135107 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:37Z","lastTransitionTime":"2025-09-30T17:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.238297 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.238677 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.238761 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.238848 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.238916 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:37Z","lastTransitionTime":"2025-09-30T17:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.342293 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.342352 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.342368 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.342391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.342408 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:37Z","lastTransitionTime":"2025-09-30T17:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.446512 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.446648 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.446672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.446703 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.446723 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:37Z","lastTransitionTime":"2025-09-30T17:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.501814 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.518582 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.524363 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:37Z is after 2025-08-24T17:21:41Z"
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.543957 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:37Z is after 2025-08-24T17:21:41Z"
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.550693 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.550765 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.550779 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.550803 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.550821 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:37Z","lastTransitionTime":"2025-09-30T17:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.562462 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:37Z is after 2025-08-24T17:21:41Z"
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.583534 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:37Z is after 2025-08-24T17:21:41Z"
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.602386 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:37Z is after 2025-08-24T17:21:41Z"
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.629058 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:37Z is after 2025-08-24T17:21:41Z"
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.644289 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:37Z is after 2025-08-24T17:21:41Z"
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.654255 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.654329 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.654349 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.654382 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.654404 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:37Z","lastTransitionTime":"2025-09-30T17:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.660111 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:37Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.675483 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:37Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.694040 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:37Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.713052 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.713163 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:37 crc kubenswrapper[4778]: E0930 17:18:37.713798 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:37 crc kubenswrapper[4778]: E0930 17:18:37.713808 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.722205 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d209948
2919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:31Z\\\",\\\"message\\\":\\\"time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z]\\\\nI0930 17:18:31.613135 6451 services_controller.go:444] Built service default/kubernetes LB per-node configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.1\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:6443, V4IPs:[]string{\\\\\\\"192.168.126.11\\\\\\\"}, V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0930 17:18:31.613156 6451 services_controller.go:445] Built service default/kubernetes LB template configs for network=default: []services.lbConfig(nil)\\\\nI0930 17:18:31.613139 6451 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:37Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.740650 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:37Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.751303 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:37Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.756512 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.756868 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.756963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.757059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.757126 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:37Z","lastTransitionTime":"2025-09-30T17:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.769031 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:37Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.786810 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:37Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.802602 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:37Z is after 2025-08-24T17:21:41Z" Sep 30 
17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.816931 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:37Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.860487 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.860532 4778 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.860544 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.860566 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.860580 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:37Z","lastTransitionTime":"2025-09-30T17:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.963008 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.963055 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.963066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.963083 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:37 crc kubenswrapper[4778]: I0930 17:18:37.963094 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:37Z","lastTransitionTime":"2025-09-30T17:18:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.066371 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.066421 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.066435 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.066456 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.066470 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:38Z","lastTransitionTime":"2025-09-30T17:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.137146 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.137216 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.137236 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.137265 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.137285 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:38Z","lastTransitionTime":"2025-09-30T17:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:38 crc kubenswrapper[4778]: E0930 17:18:38.157916 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:38Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.163198 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.163257 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.163278 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.163303 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.163338 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:38Z","lastTransitionTime":"2025-09-30T17:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:38 crc kubenswrapper[4778]: E0930 17:18:38.182965 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:38Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.188451 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.188511 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
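Every "Node became not ready" entry above carries the same Ready condition as inline JSON after the condition= marker. A minimal sketch for pulling that condition out of a captured journal line, assuming the line layout matches these kubelet entries exactly (the helper name and the trimmed sample line are illustrative, not kubelet code):

import json

def ready_condition(line: str) -> dict:
    # Everything after "condition=" in a setters.go:603 entry is plain,
    # unescaped JSON, so it can be parsed directly.
    start = line.index("condition=") + len("condition=")
    return json.loads(line[start:])

sample = ('I0930 17:18:38.163338 4778 setters.go:603] "Node became not ready" node="crc" '
          'condition={"type":"Ready","status":"False","reason":"KubeletNotReady",'
          '"message":"container runtime network not ready: NetworkReady=false '
          'reason:NetworkPluginNotReady message:Network plugin returns error: '
          'no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}')
print(ready_condition(sample)["reason"])  # -> KubeletNotReady

Against any of the setters.go:603 lines in this capture the parser yields reason KubeletNotReady, and the message field repeats that /etc/kubernetes/cni/net.d/ contains no CNI configuration, which is why the Ready condition stays False.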
event="NodeHasNoDiskPressure" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.188522 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.188567 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.188582 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:38Z","lastTransitionTime":"2025-09-30T17:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:38 crc kubenswrapper[4778]: E0930 17:18:38.203852 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:38Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.208690 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.208728 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
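Each patch attempt dies the same way: the API server cannot call the network-node-identity webhook because the POST to https://127.0.0.1:9743 fails TLS verification, the serving certificate having expired at 2025-08-24T17:21:41Z, well before the current time in these entries. A quick way to confirm the certificate dates from the node itself; a sketch that assumes the third-party cryptography package is installed and that the webhook is still listening on 9743:

import socket, ssl
from cryptography import x509  # third-party; assumed available on the node

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE  # skip verification so the expired cert can still be fetched

with socket.create_connection(("127.0.0.1", 9743), timeout=5) as sock:
    with ctx.wrap_socket(sock) as tls:
        der = tls.getpeercert(binary_form=True)

cert = x509.load_der_x509_certificate(der)
print("notBefore:", cert.not_valid_before)
print("notAfter: ", cert.not_valid_after)  # per the log, expect 2025-08-24 17:21:41

Until that certificate is rotated, every node and pod status patch is rejected with the x509 error shown above, so the kubelet can neither report Ready nor persist the pod statuses it keeps trying to write.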
event="NodeHasNoDiskPressure" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.208738 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.208757 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.208768 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:38Z","lastTransitionTime":"2025-09-30T17:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:38 crc kubenswrapper[4778]: E0930 17:18:38.223498 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:38Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.228229 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.228272 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
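The rejected patch (first attempt, 17:18:38.157916) also carries the node's capacity and allocatable figures as Kubernetes quantity strings. A small decoder for just the two suffix forms that appear in these entries, illustrative only since the full quantity grammar has many more suffixes:

def quantity(q: str) -> float:
    # Handles only the forms seen in this log: "12", "11800m", "32404552Ki".
    if q.endswith("m"):
        return float(q[:-1]) / 1000.0   # millicores -> cores
    if q.endswith("Ki"):
        return float(q[:-2]) * 1024.0   # KiB -> bytes
    return float(q)

capacity    = {"cpu": "12",     "memory": "32865352Ki"}
allocatable = {"cpu": "11800m", "memory": "32404552Ki"}
for name in capacity:
    ratio = quantity(allocatable[name]) / quantity(capacity[name])
    print(f"{name}: {ratio:.1%} of capacity allocatable")

Roughly 98% of both cpu and memory is allocatable, so the NotReady state here is a networking problem rather than resource pressure, consistent with MemoryPressure, DiskPressure, and PIDPressure all reporting status False.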
event="NodeHasNoDiskPressure" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.228285 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.228303 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.228319 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:38Z","lastTransitionTime":"2025-09-30T17:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:38 crc kubenswrapper[4778]: E0930 17:18:38.249897 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:38Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:38 crc kubenswrapper[4778]: E0930 17:18:38.250044 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.254395 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.254436 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.254448 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.254469 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.254487 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:38Z","lastTransitionTime":"2025-09-30T17:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.357981 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.358040 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.358058 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.358087 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.358134 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:38Z","lastTransitionTime":"2025-09-30T17:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.461532 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.461601 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.461662 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.461696 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.461716 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:38Z","lastTransitionTime":"2025-09-30T17:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.565480 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.565542 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.565558 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.565577 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.565591 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:38Z","lastTransitionTime":"2025-09-30T17:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.669415 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.669508 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.669526 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.669547 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.669563 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:38Z","lastTransitionTime":"2025-09-30T17:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.713337 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.713355 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:38 crc kubenswrapper[4778]: E0930 17:18:38.713579 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:38 crc kubenswrapper[4778]: E0930 17:18:38.713765 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.773189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.773262 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.773284 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.773312 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.773369 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:38Z","lastTransitionTime":"2025-09-30T17:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.876972 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.877052 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.877076 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.877108 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.877129 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:38Z","lastTransitionTime":"2025-09-30T17:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.980794 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.980896 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.981019 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.981140 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:38 crc kubenswrapper[4778]: I0930 17:18:38.981177 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:38Z","lastTransitionTime":"2025-09-30T17:18:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.086938 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.087002 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.087016 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.087039 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.087054 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:39Z","lastTransitionTime":"2025-09-30T17:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.190656 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.190713 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.190729 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.190752 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.190768 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:39Z","lastTransitionTime":"2025-09-30T17:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.294183 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.294271 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.294297 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.294331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.294356 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:39Z","lastTransitionTime":"2025-09-30T17:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.397696 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.397748 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.397758 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.397781 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.397793 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:39Z","lastTransitionTime":"2025-09-30T17:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.501123 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.501179 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.501191 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.501215 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.501230 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:39Z","lastTransitionTime":"2025-09-30T17:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.604158 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.604221 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.604236 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.604262 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.604276 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:39Z","lastTransitionTime":"2025-09-30T17:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.707854 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.707919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.707938 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.707964 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.707984 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:39Z","lastTransitionTime":"2025-09-30T17:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.713385 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.713511 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:39 crc kubenswrapper[4778]: E0930 17:18:39.713574 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:39 crc kubenswrapper[4778]: E0930 17:18:39.713926 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.811010 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.811076 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.811095 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.811118 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.811134 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:39Z","lastTransitionTime":"2025-09-30T17:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.913704 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.913769 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.913786 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.913808 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:39 crc kubenswrapper[4778]: I0930 17:18:39.913823 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:39Z","lastTransitionTime":"2025-09-30T17:18:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.016398 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.016477 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.016497 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.016528 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.016547 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:40Z","lastTransitionTime":"2025-09-30T17:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.119901 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.119967 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.119981 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.120004 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.120020 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:40Z","lastTransitionTime":"2025-09-30T17:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.223068 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.223462 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.223597 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.223856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.224072 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:40Z","lastTransitionTime":"2025-09-30T17:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.327563 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.328283 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.328474 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.328666 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.328806 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:40Z","lastTransitionTime":"2025-09-30T17:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.431487 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.431560 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.431578 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.431604 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.431658 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:40Z","lastTransitionTime":"2025-09-30T17:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.535002 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.535046 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.535056 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.535076 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.535088 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:40Z","lastTransitionTime":"2025-09-30T17:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.638076 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.638171 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.638194 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.638230 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.638286 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:40Z","lastTransitionTime":"2025-09-30T17:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.713987 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.714076 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:40 crc kubenswrapper[4778]: E0930 17:18:40.714533 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:40 crc kubenswrapper[4778]: E0930 17:18:40.714719 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.741117 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.741184 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.741196 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.741218 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.741233 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:40Z","lastTransitionTime":"2025-09-30T17:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.845414 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.845465 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.845481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.845515 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.845532 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:40Z","lastTransitionTime":"2025-09-30T17:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.949200 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.949280 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.949295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.949353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:40 crc kubenswrapper[4778]: I0930 17:18:40.949406 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:40Z","lastTransitionTime":"2025-09-30T17:18:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.052424 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.052516 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.052549 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.052585 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.052608 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:41Z","lastTransitionTime":"2025-09-30T17:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.156772 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.156841 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.156865 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.156897 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.156922 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:41Z","lastTransitionTime":"2025-09-30T17:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.260716 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.260809 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.260842 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.260882 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.260902 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:41Z","lastTransitionTime":"2025-09-30T17:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.365047 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.365098 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.365110 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.365130 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.365145 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:41Z","lastTransitionTime":"2025-09-30T17:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.468596 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.468686 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.468704 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.468732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.468753 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:41Z","lastTransitionTime":"2025-09-30T17:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.572590 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.572732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.572759 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.572799 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.572837 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:41Z","lastTransitionTime":"2025-09-30T17:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.677439 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.678042 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.678056 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.678078 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.678092 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:41Z","lastTransitionTime":"2025-09-30T17:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.713114 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.713114 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:41 crc kubenswrapper[4778]: E0930 17:18:41.713370 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:41 crc kubenswrapper[4778]: E0930 17:18:41.713598 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.734506 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.755174 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.781297 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.782238 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.782313 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.782334 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.782363 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.782382 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:41Z","lastTransitionTime":"2025-09-30T17:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.802091 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.830669 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.852730 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.876650 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.886008 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.886061 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.886072 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.886091 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.886104 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:41Z","lastTransitionTime":"2025-09-30T17:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.891898 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.913298 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.932196 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:31Z\\\",\\\"message\\\":\\\"time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z]\\\\nI0930 17:18:31.613135 6451 services_controller.go:444] Built service default/kubernetes LB per-node configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.1\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:6443, V4IPs:[]string{\\\\\\\"192.168.126.11\\\\\\\"}, V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0930 17:18:31.613156 6451 services_controller.go:445] Built service default/kubernetes LB template configs for network=default: []services.lbConfig(nil)\\\\nI0930 17:18:31.613139 6451 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.946964 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.962527 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.974773 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.989416 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.989459 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.989472 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.989492 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.989505 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:41Z","lastTransitionTime":"2025-09-30T17:18:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:41 crc kubenswrapper[4778]: I0930 17:18:41.990658 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\
\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:41Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.002069 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:42Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.018364 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:42Z is after 2025-08-24T17:21:41Z" Sep 30 
17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.033456 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:42Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.051029 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66b4b432-1e27-4651-8833-d899bc384dd1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf0e976c966dec145ecd3eb79013b835ce2a3a942935cafd7e38d8e87b182aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f76c10db4f564ed99cb9f9d92fb32b9276ac7185ab96c1f2cdc2c1123ea44b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0346e20c576a248b7661996c6dcbc130d6b235e6bf30989e2df843d917fd687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:42Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.093016 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.093071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.093084 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.093105 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.093119 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:42Z","lastTransitionTime":"2025-09-30T17:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.196712 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.196777 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.196795 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.196821 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.196839 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:42Z","lastTransitionTime":"2025-09-30T17:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.300121 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.300171 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.300189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.300219 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.300231 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:42Z","lastTransitionTime":"2025-09-30T17:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.403954 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.404023 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.404033 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.404054 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.404066 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:42Z","lastTransitionTime":"2025-09-30T17:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.507139 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.507235 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.507252 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.507281 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.507300 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:42Z","lastTransitionTime":"2025-09-30T17:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.610390 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.610467 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.610482 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.610503 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.610521 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:42Z","lastTransitionTime":"2025-09-30T17:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.713395 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.713495 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:42 crc kubenswrapper[4778]: E0930 17:18:42.713733 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:42 crc kubenswrapper[4778]: E0930 17:18:42.713972 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.714154 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.714203 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.714217 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.714237 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.714250 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:42Z","lastTransitionTime":"2025-09-30T17:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.816973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.817047 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.817059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.817080 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.817102 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:42Z","lastTransitionTime":"2025-09-30T17:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.920453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.920501 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.920513 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.920534 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:42 crc kubenswrapper[4778]: I0930 17:18:42.920551 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:42Z","lastTransitionTime":"2025-09-30T17:18:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.022912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.022966 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.022976 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.022992 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.023004 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:43Z","lastTransitionTime":"2025-09-30T17:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.126117 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.126172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.126184 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.126201 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.126212 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:43Z","lastTransitionTime":"2025-09-30T17:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.229139 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.229213 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.229232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.229261 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.229283 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:43Z","lastTransitionTime":"2025-09-30T17:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.333259 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.333309 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.333326 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.333353 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.333367 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:43Z","lastTransitionTime":"2025-09-30T17:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.436699 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.436746 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.436756 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.436778 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.436794 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:43Z","lastTransitionTime":"2025-09-30T17:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.539764 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.539832 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.539850 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.539879 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.539905 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:43Z","lastTransitionTime":"2025-09-30T17:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.643504 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.643564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.643582 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.643608 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.643654 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:43Z","lastTransitionTime":"2025-09-30T17:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.713949 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.714043 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:43 crc kubenswrapper[4778]: E0930 17:18:43.714225 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:43 crc kubenswrapper[4778]: E0930 17:18:43.714381 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.747526 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.747579 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.747591 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.747612 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.747643 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:43Z","lastTransitionTime":"2025-09-30T17:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.851595 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.851688 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.851702 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.851721 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.851735 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:43Z","lastTransitionTime":"2025-09-30T17:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.955346 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.955440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.955477 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.955510 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:43 crc kubenswrapper[4778]: I0930 17:18:43.955534 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:43Z","lastTransitionTime":"2025-09-30T17:18:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.059061 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.059128 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.059148 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.059178 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.059198 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:44Z","lastTransitionTime":"2025-09-30T17:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.162208 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.162276 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.162295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.162326 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.162344 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:44Z","lastTransitionTime":"2025-09-30T17:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.266167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.266228 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.266248 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.266278 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.266303 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:44Z","lastTransitionTime":"2025-09-30T17:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.371155 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.371307 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.371364 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.371389 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.371409 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:44Z","lastTransitionTime":"2025-09-30T17:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.474905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.474994 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.475013 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.475042 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.475064 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:44Z","lastTransitionTime":"2025-09-30T17:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.581968 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.582028 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.582040 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.582055 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.582065 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:44Z","lastTransitionTime":"2025-09-30T17:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.684192 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.684261 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.684271 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.684285 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.684296 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:44Z","lastTransitionTime":"2025-09-30T17:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.712948 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.713420 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:44 crc kubenswrapper[4778]: E0930 17:18:44.713551 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:44 crc kubenswrapper[4778]: E0930 17:18:44.713837 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.787427 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.787510 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.787523 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.787548 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.787563 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:44Z","lastTransitionTime":"2025-09-30T17:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.891386 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.891451 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.891470 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.891497 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.891517 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:44Z","lastTransitionTime":"2025-09-30T17:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.994701 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.994746 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.994756 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.994772 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:44 crc kubenswrapper[4778]: I0930 17:18:44.994782 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:44Z","lastTransitionTime":"2025-09-30T17:18:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.099758 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.099830 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.099850 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.099878 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.099898 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:45Z","lastTransitionTime":"2025-09-30T17:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.202499 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.202559 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.202576 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.202600 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.202656 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:45Z","lastTransitionTime":"2025-09-30T17:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.305599 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.305661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.305673 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.305694 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.305706 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:45Z","lastTransitionTime":"2025-09-30T17:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.409326 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.409386 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.409403 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.409424 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.409437 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:45Z","lastTransitionTime":"2025-09-30T17:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.512771 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.512835 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.512847 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.512866 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.512876 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:45Z","lastTransitionTime":"2025-09-30T17:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.616339 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.616434 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.616459 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.616492 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.616515 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:45Z","lastTransitionTime":"2025-09-30T17:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.713973 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.714080 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:45 crc kubenswrapper[4778]: E0930 17:18:45.714832 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:45 crc kubenswrapper[4778]: E0930 17:18:45.714958 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.715524 4778 scope.go:117] "RemoveContainer" containerID="db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f" Sep 30 17:18:45 crc kubenswrapper[4778]: E0930 17:18:45.715930 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.719481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.719535 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.719549 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.719572 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.719588 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:45Z","lastTransitionTime":"2025-09-30T17:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.822982 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.823047 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.823065 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.823091 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.823111 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:45Z","lastTransitionTime":"2025-09-30T17:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.926105 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.926188 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.926206 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.926240 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:45 crc kubenswrapper[4778]: I0930 17:18:45.926258 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:45Z","lastTransitionTime":"2025-09-30T17:18:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.029345 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.029411 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.029425 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.029449 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.029464 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:46Z","lastTransitionTime":"2025-09-30T17:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.132556 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.132608 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.132649 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.132672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.132687 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:46Z","lastTransitionTime":"2025-09-30T17:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.236277 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.236409 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.236433 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.236462 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.236481 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:46Z","lastTransitionTime":"2025-09-30T17:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.339792 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.339893 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.339919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.339950 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.339970 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:46Z","lastTransitionTime":"2025-09-30T17:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.442937 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.443010 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.443032 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.443082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.443106 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:46Z","lastTransitionTime":"2025-09-30T17:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.546342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.546397 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.546414 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.546437 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.546455 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:46Z","lastTransitionTime":"2025-09-30T17:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.649948 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.649998 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.650014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.650038 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.650053 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:46Z","lastTransitionTime":"2025-09-30T17:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.713310 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.713308 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:46 crc kubenswrapper[4778]: E0930 17:18:46.713535 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:46 crc kubenswrapper[4778]: E0930 17:18:46.713651 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.754021 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.754108 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.754133 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.754170 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.754195 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:46Z","lastTransitionTime":"2025-09-30T17:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.858057 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.858136 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.858158 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.858189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.858212 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:46Z","lastTransitionTime":"2025-09-30T17:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.961423 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.961486 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.961500 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.961533 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:46 crc kubenswrapper[4778]: I0930 17:18:46.961549 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:46Z","lastTransitionTime":"2025-09-30T17:18:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.065298 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.065368 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.065387 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.065448 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.065468 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:47Z","lastTransitionTime":"2025-09-30T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.168439 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.168816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.168912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.169015 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.169081 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:47Z","lastTransitionTime":"2025-09-30T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.271407 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.271475 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.271494 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.271522 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.271541 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:47Z","lastTransitionTime":"2025-09-30T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.374037 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.374083 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.374094 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.374112 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.374124 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:47Z","lastTransitionTime":"2025-09-30T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.478123 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.478211 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.478240 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.478276 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.478296 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:47Z","lastTransitionTime":"2025-09-30T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.581313 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.581362 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.581373 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.581390 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.581401 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:47Z","lastTransitionTime":"2025-09-30T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.684131 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.684177 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.684186 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.684205 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.684221 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:47Z","lastTransitionTime":"2025-09-30T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.713056 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.713130 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:47 crc kubenswrapper[4778]: E0930 17:18:47.713269 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:47 crc kubenswrapper[4778]: E0930 17:18:47.713398 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.787382 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.787465 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.787484 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.787512 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.787534 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:47Z","lastTransitionTime":"2025-09-30T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.890941 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.891012 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.891030 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.891059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.891079 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:47Z","lastTransitionTime":"2025-09-30T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.994238 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.994312 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.994330 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.994357 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:47 crc kubenswrapper[4778]: I0930 17:18:47.994376 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:47Z","lastTransitionTime":"2025-09-30T17:18:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.097874 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.097961 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.097981 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.098007 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.098025 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:48Z","lastTransitionTime":"2025-09-30T17:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.201900 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.201978 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.201989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.202006 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.202017 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:48Z","lastTransitionTime":"2025-09-30T17:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.288239 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.288292 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.288305 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.288329 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.288344 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:48Z","lastTransitionTime":"2025-09-30T17:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:48 crc kubenswrapper[4778]: E0930 17:18:48.304111 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:48Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.310187 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.310252 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.310266 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.310289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.310303 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:48Z","lastTransitionTime":"2025-09-30T17:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:48 crc kubenswrapper[4778]: E0930 17:18:48.325334 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:48Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.330281 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.330323 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.330335 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.330354 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.330365 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:48Z","lastTransitionTime":"2025-09-30T17:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:48 crc kubenswrapper[4778]: E0930 17:18:48.345056 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:48Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.353261 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.353304 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.353316 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.353334 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.353346 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:48Z","lastTransitionTime":"2025-09-30T17:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:48 crc kubenswrapper[4778]: E0930 17:18:48.367679 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:48Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.372109 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.372231 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.372243 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.372263 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.372273 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:48Z","lastTransitionTime":"2025-09-30T17:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:48 crc kubenswrapper[4778]: E0930 17:18:48.386930 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:48Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:48 crc kubenswrapper[4778]: E0930 17:18:48.387042 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.388884 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.388922 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.388934 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.388950 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.388964 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:48Z","lastTransitionTime":"2025-09-30T17:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.491277 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.491323 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.491337 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.491360 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.491375 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:48Z","lastTransitionTime":"2025-09-30T17:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.594701 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.594767 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.594785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.594812 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.594832 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:48Z","lastTransitionTime":"2025-09-30T17:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.617247 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs\") pod \"network-metrics-daemon-l88vm\" (UID: \"8b0c73d9-9a75-4e65-9220-904133af63fd\") " pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:48 crc kubenswrapper[4778]: E0930 17:18:48.617477 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:18:48 crc kubenswrapper[4778]: E0930 17:18:48.617605 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs podName:8b0c73d9-9a75-4e65-9220-904133af63fd nodeName:}" failed. No retries permitted until 2025-09-30 17:19:20.617573842 +0000 UTC m=+99.607471835 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs") pod "network-metrics-daemon-l88vm" (UID: "8b0c73d9-9a75-4e65-9220-904133af63fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.698012 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.698082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.698102 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.698129 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.698148 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:48Z","lastTransitionTime":"2025-09-30T17:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.713225 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.713247 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:48 crc kubenswrapper[4778]: E0930 17:18:48.713406 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:48 crc kubenswrapper[4778]: E0930 17:18:48.713528 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.801519 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.801559 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.801569 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.801584 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.801595 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:48Z","lastTransitionTime":"2025-09-30T17:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.904524 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.904585 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.904598 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.904639 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:48 crc kubenswrapper[4778]: I0930 17:18:48.904650 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:48Z","lastTransitionTime":"2025-09-30T17:18:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.007702 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.007772 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.007786 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.007806 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.007821 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:49Z","lastTransitionTime":"2025-09-30T17:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.110515 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.110576 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.110594 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.110647 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.110724 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:49Z","lastTransitionTime":"2025-09-30T17:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.213847 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.213896 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.213905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.213924 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.213934 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:49Z","lastTransitionTime":"2025-09-30T17:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.317010 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.317060 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.317071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.317089 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.317102 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:49Z","lastTransitionTime":"2025-09-30T17:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.419444 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.419495 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.419509 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.419529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.419544 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:49Z","lastTransitionTime":"2025-09-30T17:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.523069 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.523109 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.523120 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.523138 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.523151 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:49Z","lastTransitionTime":"2025-09-30T17:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.626609 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.626735 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.626754 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.626783 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.626800 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:49Z","lastTransitionTime":"2025-09-30T17:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.713398 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.713465 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:49 crc kubenswrapper[4778]: E0930 17:18:49.713561 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:49 crc kubenswrapper[4778]: E0930 17:18:49.713672 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.730636 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.730670 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.730679 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.730695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.730726 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:49Z","lastTransitionTime":"2025-09-30T17:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.834116 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.834180 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.834193 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.834217 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.834237 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:49Z","lastTransitionTime":"2025-09-30T17:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.937216 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.937264 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.937294 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.937313 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:49 crc kubenswrapper[4778]: I0930 17:18:49.937323 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:49Z","lastTransitionTime":"2025-09-30T17:18:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.040217 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.040286 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.040305 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.040331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.040348 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:50Z","lastTransitionTime":"2025-09-30T17:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.143054 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.143151 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.143165 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.143184 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.143199 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:50Z","lastTransitionTime":"2025-09-30T17:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.194353 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vmbxd_99e0ced4-d228-4bfa-a263-b8934f0d8e5d/kube-multus/0.log" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.194410 4778 generic.go:334] "Generic (PLEG): container finished" podID="99e0ced4-d228-4bfa-a263-b8934f0d8e5d" containerID="7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233" exitCode=1 Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.194448 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vmbxd" event={"ID":"99e0ced4-d228-4bfa-a263-b8934f0d8e5d","Type":"ContainerDied","Data":"7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233"} Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.194899 4778 scope.go:117] "RemoveContainer" containerID="7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.210341 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66b4b432-1e27-4651-8833-d899bc384dd1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf0e976c966dec145ecd3eb79013b835ce2a3a942935cafd7e38d8e87b182aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f76c10db4f564ed99cb9f9d92fb32b9276ac7185ab96c1f2cdc2c1123ea44b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0346e20c576a248b7661996c6dcbc130d6b235e6bf30989e2df843d917fd687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.227141 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.243340 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:50Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:50Z\\\",\\\"message\\\":\\\"2025-09-30T17:18:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a\\\\n2025-09-30T17:18:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a to /host/opt/cni/bin/\\\\n2025-09-30T17:18:04Z [verbose] multus-daemon started\\\\n2025-09-30T17:18:04Z [verbose] Readiness Indicator file check\\\\n2025-09-30T17:18:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.245742 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.245895 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.245994 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.246090 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.246191 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:50Z","lastTransitionTime":"2025-09-30T17:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.256802 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.269869 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 
17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.282424 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.305203 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0e
a333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.323777 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548d
eadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.338524 4778 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.349326 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.349639 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.349748 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.349850 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.350017 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:50Z","lastTransitionTime":"2025-09-30T17:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.352047 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.366518 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.378797 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.392680 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e
35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.408166 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.423312 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.438916 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.453295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.453365 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.453381 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.453405 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.453419 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:50Z","lastTransitionTime":"2025-09-30T17:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.460016 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.482843 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:31Z\\\",\\\"message\\\":\\\"time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z]\\\\nI0930 17:18:31.613135 6451 services_controller.go:444] Built service default/kubernetes LB per-node configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.1\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:6443, V4IPs:[]string{\\\\\\\"192.168.126.11\\\\\\\"}, V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0930 17:18:31.613156 6451 services_controller.go:445] Built service default/kubernetes LB template configs for network=default: []services.lbConfig(nil)\\\\nI0930 17:18:31.613139 6451 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:50Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.556866 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.556916 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.556928 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.556947 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.556962 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:50Z","lastTransitionTime":"2025-09-30T17:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.660238 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.660310 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.660322 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.660345 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.660362 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:50Z","lastTransitionTime":"2025-09-30T17:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.713277 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.713310 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:50 crc kubenswrapper[4778]: E0930 17:18:50.713406 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:50 crc kubenswrapper[4778]: E0930 17:18:50.713508 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.762892 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.762981 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.763003 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.763033 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.763052 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:50Z","lastTransitionTime":"2025-09-30T17:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.866118 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.866228 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.866252 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.866285 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.866309 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:50Z","lastTransitionTime":"2025-09-30T17:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.969132 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.969210 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.969234 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.969266 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:50 crc kubenswrapper[4778]: I0930 17:18:50.969290 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:50Z","lastTransitionTime":"2025-09-30T17:18:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.072047 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.072092 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.072104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.072124 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.072136 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:51Z","lastTransitionTime":"2025-09-30T17:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.175043 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.175091 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.175104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.175122 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.175136 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:51Z","lastTransitionTime":"2025-09-30T17:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.200995 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vmbxd_99e0ced4-d228-4bfa-a263-b8934f0d8e5d/kube-multus/0.log" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.201070 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vmbxd" event={"ID":"99e0ced4-d228-4bfa-a263-b8934f0d8e5d","Type":"ContainerStarted","Data":"b2ba2dd80ae8843f4b47955ec634603e4a462a2f3480f75ce8a6fcbcb628dd75"} Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.219834 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.245030 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data
-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b2
5df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.263783 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.277882 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.277938 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.277950 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.277969 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.277982 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:51Z","lastTransitionTime":"2025-09-30T17:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.282231 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.296974 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.311026 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.326322 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.338894 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.352980 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.366387 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.381971 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.382031 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.382046 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.382066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.382078 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:51Z","lastTransitionTime":"2025-09-30T17:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.382848 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.403917 4778 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:31Z\\\",\\\"message\\\":\\\"time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z]\\\\nI0930 17:18:31.613135 6451 services_controller.go:444] Built service default/kubernetes LB per-node configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.1\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:6443, V4IPs:[]string{\\\\\\\"192.168.126.11\\\\\\\"}, V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0930 17:18:31.613156 6451 services_controller.go:445] Built service default/kubernetes LB template configs for network=default: []services.lbConfig(nil)\\\\nI0930 17:18:31.613139 6451 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.419901 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.432383 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.456763 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66b4b432-1e27-4651-8833-d899bc384dd1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf0e976c966dec145ecd3eb79013b835ce2a3a942935cafd7e38d8e87b182aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f76c10db4f564ed99cb9f9d92fb32b9276ac7185ab96c1f2cdc2c1123ea44b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0346e20c576a248b7661996c6dcbc130d6b235e6bf30989e2df843d917fd687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.470436 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.485384 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.485432 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.485446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.485470 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.485485 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:51Z","lastTransitionTime":"2025-09-30T17:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.491440 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ba2dd80ae8843f4b47955ec634603e4a462a2f3480f75ce8a6fcbcb628dd75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:50Z\\\",\\\"message\\\":\\\"2025-09-30T17:18:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a\\\\n2025-09-30T17:18:04+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a to /host/opt/cni/bin/\\\\n2025-09-30T17:18:04Z [verbose] multus-daemon started\\\\n2025-09-30T17:18:04Z [verbose] Readiness Indicator file check\\\\n2025-09-30T17:18:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.506425 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.588817 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.588864 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.588875 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.588893 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.588905 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:51Z","lastTransitionTime":"2025-09-30T17:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.692354 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.692443 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.692462 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.692492 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.692515 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:51Z","lastTransitionTime":"2025-09-30T17:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.713669 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.713676 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:51 crc kubenswrapper[4778]: E0930 17:18:51.713840 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:51 crc kubenswrapper[4778]: E0930 17:18:51.714000 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.729112 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.741336 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ba2dd80ae8843f4b47955ec634603e4a462a2f3480f75ce8a6fcbcb628dd75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:50Z\\\",\\\"message\\\":\\\"2025-09-30T17:18:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a\\\\n2025-09-30T17:18:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a to /host/opt/cni/bin/\\\\n2025-09-30T17:18:04Z [verbose] multus-daemon started\\\\n2025-09-30T17:18:04Z [verbose] Readiness Indicator file check\\\\n2025-09-30T17:18:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.754945 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.768009 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 
17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.780926 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.795016 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.795087 4778 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.795100 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.795142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.795154 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:51Z","lastTransitionTime":"2025-09-30T17:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.795229 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66b4b432-1e27-4651-8833-d899bc384dd1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf0e976c966dec145ecd3eb79013b835ce2a3a942935cafd7e38d8e87b182aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f76c10db4f564ed99cb9f9d92fb32b9276ac7185ab96c1f2cdc2c1123ea44b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0346e20c576a248b7661996c6dcbc130d6b235e6bf30989e2df843d917fd687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.809551 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.826657 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.844856 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.859676 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.882904 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.898001 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.898048 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.898063 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.898083 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.898098 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:51Z","lastTransitionTime":"2025-09-30T17:18:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.899009 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcon
t/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.912829 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.926199 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.940704 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1
b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.958700 4778 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32
fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:31Z\\\",\\\"message\\\":\\\"time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z]\\\\nI0930 17:18:31.613135 6451 services_controller.go:444] Built service default/kubernetes LB per-node configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.1\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:6443, V4IPs:[]string{\\\\\\\"192.168.126.11\\\\\\\"}, V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0930 17:18:31.613156 6451 services_controller.go:445] Built service default/kubernetes LB template configs for network=default: []services.lbConfig(nil)\\\\nI0930 17:18:31.613139 6451 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.972821 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:51 crc kubenswrapper[4778]: I0930 17:18:51.994205 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:51Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.000453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.000525 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.000541 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.000563 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.000577 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:52Z","lastTransitionTime":"2025-09-30T17:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.103578 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.103681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.103694 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.103717 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.103730 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:52Z","lastTransitionTime":"2025-09-30T17:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.206169 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.206219 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.206228 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.206246 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.206257 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:52Z","lastTransitionTime":"2025-09-30T17:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.309129 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.309176 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.309189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.309209 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.309226 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:52Z","lastTransitionTime":"2025-09-30T17:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.416405 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.416446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.416458 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.416481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.416496 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:52Z","lastTransitionTime":"2025-09-30T17:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.519860 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.519923 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.519937 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.519953 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.519968 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:52Z","lastTransitionTime":"2025-09-30T17:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.623391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.623448 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.623466 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.623491 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.623509 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:52Z","lastTransitionTime":"2025-09-30T17:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.713537 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:52 crc kubenswrapper[4778]: E0930 17:18:52.713781 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.713949 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:52 crc kubenswrapper[4778]: E0930 17:18:52.714138 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.726266 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.726294 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.726305 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.726321 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.726331 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:52Z","lastTransitionTime":"2025-09-30T17:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.830116 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.830191 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.830204 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.830228 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.830246 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:52Z","lastTransitionTime":"2025-09-30T17:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.933909 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.933986 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.934005 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.934034 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:52 crc kubenswrapper[4778]: I0930 17:18:52.934059 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:52Z","lastTransitionTime":"2025-09-30T17:18:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.037014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.037088 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.037125 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.037161 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.037186 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:53Z","lastTransitionTime":"2025-09-30T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.139874 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.139941 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.139961 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.139987 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.140009 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:53Z","lastTransitionTime":"2025-09-30T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.243073 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.243155 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.243172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.243197 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.243218 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:53Z","lastTransitionTime":"2025-09-30T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.346410 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.346460 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.346476 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.346503 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.346520 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:53Z","lastTransitionTime":"2025-09-30T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.449554 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.449601 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.449631 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.449650 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.449662 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:53Z","lastTransitionTime":"2025-09-30T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.552835 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.552917 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.552938 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.552966 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.552986 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:53Z","lastTransitionTime":"2025-09-30T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.655812 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.655861 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.655870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.655886 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.655897 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:53Z","lastTransitionTime":"2025-09-30T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.714270 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.714320 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:53 crc kubenswrapper[4778]: E0930 17:18:53.714557 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:53 crc kubenswrapper[4778]: E0930 17:18:53.714745 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.758867 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.758932 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.758945 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.758965 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.758980 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:53Z","lastTransitionTime":"2025-09-30T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.862888 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.862960 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.862974 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.862998 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.863014 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:53Z","lastTransitionTime":"2025-09-30T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.966459 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.966507 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.966529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.966550 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:53 crc kubenswrapper[4778]: I0930 17:18:53.966563 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:53Z","lastTransitionTime":"2025-09-30T17:18:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.069492 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.069816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.069963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.070103 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.070244 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:54Z","lastTransitionTime":"2025-09-30T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.173810 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.173850 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.173861 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.173878 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.173888 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:54Z","lastTransitionTime":"2025-09-30T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.276317 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.276408 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.276426 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.276456 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.276480 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:54Z","lastTransitionTime":"2025-09-30T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.379863 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.379920 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.379933 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.379955 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.379969 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:54Z","lastTransitionTime":"2025-09-30T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.483189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.483271 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.483289 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.483325 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.483345 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:54Z","lastTransitionTime":"2025-09-30T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.586524 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.586567 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.586576 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.586592 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.586608 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:54Z","lastTransitionTime":"2025-09-30T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.689635 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.689680 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.689693 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.689712 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.689735 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:54Z","lastTransitionTime":"2025-09-30T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.713556 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.713600 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:54 crc kubenswrapper[4778]: E0930 17:18:54.713755 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:54 crc kubenswrapper[4778]: E0930 17:18:54.713928 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.793564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.793661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.793677 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.793702 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.793718 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:54Z","lastTransitionTime":"2025-09-30T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.896444 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.896481 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.896490 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.896505 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.896516 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:54Z","lastTransitionTime":"2025-09-30T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.999844 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.999893 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:54 crc kubenswrapper[4778]: I0930 17:18:54.999905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:54.999925 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:54.999938 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:54Z","lastTransitionTime":"2025-09-30T17:18:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.102737 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.102803 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.102820 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.102845 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.102863 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:55Z","lastTransitionTime":"2025-09-30T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.206361 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.206445 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.206470 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.206501 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.206525 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:55Z","lastTransitionTime":"2025-09-30T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.309567 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.309634 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.309647 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.309664 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.309675 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:55Z","lastTransitionTime":"2025-09-30T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.412583 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.412647 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.412658 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.412674 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.412685 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:55Z","lastTransitionTime":"2025-09-30T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.515837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.515898 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.515922 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.515953 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.515976 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:55Z","lastTransitionTime":"2025-09-30T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.618318 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.618391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.618402 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.618418 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.618428 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:55Z","lastTransitionTime":"2025-09-30T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.713424 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.713424 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:55 crc kubenswrapper[4778]: E0930 17:18:55.713555 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:55 crc kubenswrapper[4778]: E0930 17:18:55.713608 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.719733 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.719778 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.719786 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.719800 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.719809 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:55Z","lastTransitionTime":"2025-09-30T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.822838 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.822903 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.822917 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.822934 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.822948 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:55Z","lastTransitionTime":"2025-09-30T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.926205 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.926242 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.926256 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.926274 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:55 crc kubenswrapper[4778]: I0930 17:18:55.926288 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:55Z","lastTransitionTime":"2025-09-30T17:18:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.029348 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.029415 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.029428 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.029447 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.029459 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:56Z","lastTransitionTime":"2025-09-30T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.131792 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.131833 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.131843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.131860 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.131871 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:56Z","lastTransitionTime":"2025-09-30T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.235247 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.235291 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.235301 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.235322 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.235333 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:56Z","lastTransitionTime":"2025-09-30T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.337661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.337710 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.337720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.337737 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.337747 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:56Z","lastTransitionTime":"2025-09-30T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.439865 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.439931 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.439950 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.439974 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.439991 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:56Z","lastTransitionTime":"2025-09-30T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.543114 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.543167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.543179 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.543198 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.543213 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:56Z","lastTransitionTime":"2025-09-30T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.646202 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.646261 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.646275 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.646298 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.646316 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:56Z","lastTransitionTime":"2025-09-30T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.713087 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:56 crc kubenswrapper[4778]: E0930 17:18:56.713267 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.713509 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:56 crc kubenswrapper[4778]: E0930 17:18:56.713592 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.749359 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.749401 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.749410 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.749425 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.749498 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:56Z","lastTransitionTime":"2025-09-30T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.852150 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.852182 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.852191 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.852205 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.852217 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:56Z","lastTransitionTime":"2025-09-30T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.956042 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.956088 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.956100 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.956116 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:56 crc kubenswrapper[4778]: I0930 17:18:56.956127 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:56Z","lastTransitionTime":"2025-09-30T17:18:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.059050 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.059100 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.059111 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.059137 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.059151 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:57Z","lastTransitionTime":"2025-09-30T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.162676 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.162710 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.162719 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.162734 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.162745 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:57Z","lastTransitionTime":"2025-09-30T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.264934 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.264978 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.264987 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.265003 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.265017 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:57Z","lastTransitionTime":"2025-09-30T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.367925 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.367969 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.367980 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.367997 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.368010 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:57Z","lastTransitionTime":"2025-09-30T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.470505 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.470578 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.470596 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.470629 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.470644 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:57Z","lastTransitionTime":"2025-09-30T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.573222 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.573267 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.573283 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.573300 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.573311 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:57Z","lastTransitionTime":"2025-09-30T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.675328 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.675366 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.675375 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.675390 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.675400 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:57Z","lastTransitionTime":"2025-09-30T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.713120 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.713181 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:57 crc kubenswrapper[4778]: E0930 17:18:57.713248 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:57 crc kubenswrapper[4778]: E0930 17:18:57.713310 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.777797 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.777834 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.777842 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.777856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.777865 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:57Z","lastTransitionTime":"2025-09-30T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.880694 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.880735 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.880743 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.880757 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.880767 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:57Z","lastTransitionTime":"2025-09-30T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.986259 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.986832 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.987028 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.987059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:57 crc kubenswrapper[4778]: I0930 17:18:57.987073 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:57Z","lastTransitionTime":"2025-09-30T17:18:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.089731 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.089771 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.089779 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.089793 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.089803 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:58Z","lastTransitionTime":"2025-09-30T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.192374 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.192420 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.192428 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.192445 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.192455 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:58Z","lastTransitionTime":"2025-09-30T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.294753 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.294790 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.294798 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.294813 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.294823 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:58Z","lastTransitionTime":"2025-09-30T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.397556 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.397603 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.397633 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.397652 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.397663 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:58Z","lastTransitionTime":"2025-09-30T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.500560 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.500633 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.500647 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.500665 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.500678 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:58Z","lastTransitionTime":"2025-09-30T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.602843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.602910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.602925 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.602943 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.602955 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:58Z","lastTransitionTime":"2025-09-30T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.705585 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.705671 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.705681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.705698 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.705711 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:58Z","lastTransitionTime":"2025-09-30T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.713872 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.714054 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:18:58 crc kubenswrapper[4778]: E0930 17:18:58.714161 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:18:58 crc kubenswrapper[4778]: E0930 17:18:58.714266 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.751874 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.751926 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.751936 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.751951 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.751961 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:58Z","lastTransitionTime":"2025-09-30T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:58 crc kubenswrapper[4778]: E0930 17:18:58.768788 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeByt
es\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.773643 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.773692 4778 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.773711 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.773734 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.773747 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:58Z","lastTransitionTime":"2025-09-30T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:58 crc kubenswrapper[4778]: E0930 17:18:58.788961 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.792782 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.792847 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
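Every failed status patch in this log fails the same way: the PATCH to the API server is rejected because the node.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, more than a month before the log's current time of 2025-09-30T17:18:58Z. A minimal sketch of how the certificate window could be confirmed from the node, assuming Go is available and using the endpoint named in the error (an illustration, not part of the log):

    package main

    // Hypothetical diagnostic: connect to the webhook endpoint named in the
    // error above and print the serving certificate's validity window.
    // InsecureSkipVerify is used only so the handshake completes long enough
    // to read the (expired) certificate.

    import (
        "crypto/tls"
        "fmt"
        "time"
    )

    func main() {
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            fmt.Println("dial failed:", err)
            return
        }
        defer conn.Close()

        cert := conn.ConnectionState().PeerCertificates[0]
        fmt.Println("NotBefore:", cert.NotBefore)
        fmt.Println("NotAfter: ", cert.NotAfter)
        fmt.Println("expired:  ", time.Now().After(cert.NotAfter)) // true in the scenario logged here
    }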
event="NodeHasNoDiskPressure" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.793066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.793082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.793094 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:58Z","lastTransitionTime":"2025-09-30T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:58 crc kubenswrapper[4778]: E0930 17:18:58.806884 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.811152 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.811223 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
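The NodeNotReady condition repeated throughout carries the second, independent fault: the container runtime reports NetworkReady=false because /etc/kubernetes/cni/net.d/ contains no CNI configuration file, meaning the cluster network provider has not written its config yet. A hedged sketch of the corresponding on-node check (only the directory path is taken from the log; everything else is illustrative):

    package main

    // Hypothetical check mirroring the kubelet's complaint: list the CNI
    // config directory named in the log and report whether any configuration
    // is present. An empty or missing directory matches NetworkReady=false.

    import (
        "fmt"
        "os"
    )

    func main() {
        const dir = "/etc/kubernetes/cni/net.d/" // path taken verbatim from the log
        entries, err := os.ReadDir(dir)
        if err != nil {
            fmt.Println("cannot read CNI config dir:", err)
            return
        }
        if len(entries) == 0 {
            fmt.Println("no CNI configuration file in", dir, "- network plugin not ready")
            return
        }
        for _, e := range entries {
            fmt.Println("found CNI config:", e.Name())
        }
    }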
event="NodeHasNoDiskPressure" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.811234 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.811252 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.811266 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:58Z","lastTransitionTime":"2025-09-30T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:58 crc kubenswrapper[4778]: E0930 17:18:58.822827 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.827078 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.827112 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
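The patch attempts at 17:18:58.768788, .788961, .806884, and .822827 above are four of the kubelet's node-status retries; the fifth attempt and the "update node status exceeds retry count" give-up record follow immediately below. A minimal sketch of that retry shape, assuming a fixed budget of five attempts as in the upstream kubelet's nodeStatusUpdateRetry constant (names and structure are illustrative, not the actual kubelet code):

    package main

    // Illustrative sketch of the retry loop visible in this log: each patch
    // attempt fails on the webhook TLS error, and once the retry budget is
    // spent the kubelet logs "update node status exceeds retry count".
    // nodeStatusUpdateRetry mirrors the upstream kubelet constant (5).

    import (
        "errors"
        "fmt"
    )

    const nodeStatusUpdateRetry = 5

    func tryUpdateNodeStatus() error {
        // Stand-in for the PATCH that the expired webhook certificate rejects.
        return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": certificate has expired`)
    }

    func updateNodeStatus() error {
        for i := 0; i < nodeStatusUpdateRetry; i++ {
            err := tryUpdateNodeStatus()
            if err == nil {
                return nil
            }
            fmt.Println("Error updating node status, will retry:", err)
        }
        return errors.New("update node status exceeds retry count")
    }

    func main() {
        if err := updateNodeStatus(); err != nil {
            fmt.Println(err)
        }
    }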
event="NodeHasNoDiskPressure" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.827121 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.827137 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.827164 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:58Z","lastTransitionTime":"2025-09-30T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:58 crc kubenswrapper[4778]: E0930 17:18:58.841046 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:18:58Z is after 2025-08-24T17:21:41Z" Sep 30 17:18:58 crc kubenswrapper[4778]: E0930 17:18:58.841252 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.843025 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.843062 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.843075 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.843092 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.843105 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:58Z","lastTransitionTime":"2025-09-30T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.945754 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.945801 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.945819 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.945841 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:58 crc kubenswrapper[4778]: I0930 17:18:58.945866 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:58Z","lastTransitionTime":"2025-09-30T17:18:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.048818 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.048872 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.048885 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.048903 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.048916 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:59Z","lastTransitionTime":"2025-09-30T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.151293 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.151331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.151341 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.151376 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.151387 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:59Z","lastTransitionTime":"2025-09-30T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.253985 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.254056 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.254077 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.254098 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.254111 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:59Z","lastTransitionTime":"2025-09-30T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.357160 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.357202 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.357218 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.357237 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.357251 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:59Z","lastTransitionTime":"2025-09-30T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.460447 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.460506 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.460527 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.460547 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.460562 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:59Z","lastTransitionTime":"2025-09-30T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.563232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.563288 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.563299 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.563317 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.563332 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:59Z","lastTransitionTime":"2025-09-30T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.666127 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.666156 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.666166 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.666182 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.666194 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:59Z","lastTransitionTime":"2025-09-30T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.713986 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.714207 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:18:59 crc kubenswrapper[4778]: E0930 17:18:59.714414 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:18:59 crc kubenswrapper[4778]: E0930 17:18:59.715141 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.715548 4778 scope.go:117] "RemoveContainer" containerID="db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.774779 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.774853 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.774866 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.774885 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.774897 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:59Z","lastTransitionTime":"2025-09-30T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.878048 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.878113 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.878127 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.878148 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.878161 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:59Z","lastTransitionTime":"2025-09-30T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.980729 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.980945 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.981008 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.981026 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:18:59 crc kubenswrapper[4778]: I0930 17:18:59.981037 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:18:59Z","lastTransitionTime":"2025-09-30T17:18:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.083981 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.084028 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.084041 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.084061 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.084074 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:00Z","lastTransitionTime":"2025-09-30T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.187139 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.187170 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.187179 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.187194 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.187203 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:00Z","lastTransitionTime":"2025-09-30T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.238772 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovnkube-controller/2.log" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.242018 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerStarted","Data":"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca"} Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.242460 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.255980 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66b4b432-1e27-4651-8833-d899bc384dd1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf0e976c966dec145ecd3eb79013b835ce2a3a942935cafd7e38d8e87b182aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f76c10db4f564ed99cb9f9d92fb32b9276ac7185ab96c1f2cdc2c1123ea44b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0346e20c576a248b7661996c6dcbc130d6b235e6bf30989e2df843d917fd687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.265640 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.278283 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ba2dd80ae8843f4b47955ec634603e4a462a2f3480f75ce8a6fcbcb628dd75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:50Z\\\",\\\"message\\\":\\\"2025-09-30T17:18:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a\\\\n2025-09-30T17:18:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a to /host/opt/cni/bin/\\\\n2025-09-30T17:18:04Z [verbose] multus-daemon started\\\\n2025-09-30T17:18:04Z [verbose] Readiness Indicator file check\\\\n2025-09-30T17:18:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.289460 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.289992 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.290027 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.290040 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.290057 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.290071 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:00Z","lastTransitionTime":"2025-09-30T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.299588 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.309473 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.323549 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.334660 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.348024 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.364024 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.376135 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.393311 4778 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.393350 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.393360 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.393375 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.393386 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:00Z","lastTransitionTime":"2025-09-30T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.396747 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.409441 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.422784 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.434582 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.448268 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"
system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.465656 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b33e20d080c52728fb95819f8441a050f8e77b7
34b98843d99d069c978f35ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:31Z\\\",\\\"message\\\":\\\"time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z]\\\\nI0930 17:18:31.613135 6451 services_controller.go:444] Built service default/kubernetes LB per-node configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.1\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:6443, V4IPs:[]string{\\\\\\\"192.168.126.11\\\\\\\"}, V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0930 17:18:31.613156 6451 services_controller.go:445] Built service default/kubernetes LB template configs for network=default: []services.lbConfig(nil)\\\\nI0930 17:18:31.613139 6451 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.477272 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:00Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.496521 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.496567 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.496581 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.496597 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.496611 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:00Z","lastTransitionTime":"2025-09-30T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.598967 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.599012 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.599025 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.599046 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.599060 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:00Z","lastTransitionTime":"2025-09-30T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.702491 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.702549 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.702562 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.702584 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.702656 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:00Z","lastTransitionTime":"2025-09-30T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.714015 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.714018 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:00 crc kubenswrapper[4778]: E0930 17:19:00.714153 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:00 crc kubenswrapper[4778]: E0930 17:19:00.714324 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.806278 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.806322 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.806334 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.806352 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.806367 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:00Z","lastTransitionTime":"2025-09-30T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.908371 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.908404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.908440 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.908463 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:00 crc kubenswrapper[4778]: I0930 17:19:00.908474 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:00Z","lastTransitionTime":"2025-09-30T17:19:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.011388 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.011449 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.011466 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.011489 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.011508 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:01Z","lastTransitionTime":"2025-09-30T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.114228 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.114271 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.114281 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.114296 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.114307 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:01Z","lastTransitionTime":"2025-09-30T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.217757 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.217816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.217832 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.217854 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.217870 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:01Z","lastTransitionTime":"2025-09-30T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.247501 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovnkube-controller/3.log" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.248354 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovnkube-controller/2.log" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.252535 4778 generic.go:334] "Generic (PLEG): container finished" podID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerID="2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca" exitCode=1 Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.252632 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerDied","Data":"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca"} Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.252720 4778 scope.go:117] "RemoveContainer" containerID="db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.254200 4778 scope.go:117] "RemoveContainer" containerID="2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca" Sep 30 17:19:01 crc kubenswrapper[4778]: E0930 17:19:01.254516 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.273493 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.286073 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.301282 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1
b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.320007 4778 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.320052 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.320066 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.320084 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.320097 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:01Z","lastTransitionTime":"2025-09-30T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.320279 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b33e20d080c52728fb95819f8441a050f8e77b7
34b98843d99d069c978f35ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:31Z\\\",\\\"message\\\":\\\"time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z]\\\\nI0930 17:18:31.613135 6451 services_controller.go:444] Built service default/kubernetes LB per-node configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.1\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:6443, V4IPs:[]string{\\\\\\\"192.168.126.11\\\\\\\"}, V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0930 17:18:31.613156 6451 services_controller.go:445] Built service default/kubernetes LB template configs for network=default: []services.lbConfig(nil)\\\\nI0930 17:18:31.613139 6451 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:19:00Z\\\",\\\"message\\\":\\\" *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:19:00.501933 6817 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.501997 6817 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502069 6817 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502095 6817 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502147 6817 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502288 6817 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.514772 6817 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 17:19:00.514787 6817 
services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 17:19:00.514860 6817 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:19:00.514885 6817 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:19:00.514987 6817 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.332965 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.346785 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.360146 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.377011 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ba2dd80ae8843f4b47955ec634603e4a462a2f3480f75ce8a6fcbcb628dd75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:50Z\\\",\\\"message\\\":\\\"2025-09-30T17:18:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a\\\\n2025-09-30T17:18:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a to /host/opt/cni/bin/\\\\n2025-09-30T17:18:04Z [verbose] multus-daemon started\\\\n2025-09-30T17:18:04Z [verbose] Readiness Indicator file check\\\\n2025-09-30T17:18:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.388955 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.401483 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 
17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.413307 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.422916 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.422976 4778 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.422990 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.423011 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.423023 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:01Z","lastTransitionTime":"2025-09-30T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.425376 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66b4b432-1e27-4651-8833-d899bc384dd1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf0e976c966dec145ecd3eb79013b835ce2a3a942935cafd7e38d8e87b182aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f76c10db4f564ed99cb9f9d92fb32b9276ac7185ab96c1f2cdc2c1123ea44b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0346e20c576a248b7661996c6dcbc130d6b235e6bf30989e2df843d917fd687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.439961 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.451717 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.464078 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.475752 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.495735 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.510256 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.525955 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.525993 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.526001 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.526015 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.526025 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:01Z","lastTransitionTime":"2025-09-30T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.628847 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.628892 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.628901 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.628917 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.628929 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:01Z","lastTransitionTime":"2025-09-30T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.713142 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.713217 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:01 crc kubenswrapper[4778]: E0930 17:19:01.713313 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:01 crc kubenswrapper[4778]: E0930 17:19:01.713466 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.732856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.732892 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.732905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.732919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.732929 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:01Z","lastTransitionTime":"2025-09-30T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.736074 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b33e20d080c52728fb95819f8441a050f8e77b7
34b98843d99d069c978f35ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://db35bbe73d927f932166ead662e46d6102c3215936229c0fb99ac4d01d14af7f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:31Z\\\",\\\"message\\\":\\\"time 2025-09-30T17:18:31Z is after 2025-08-24T17:21:41Z]\\\\nI0930 17:18:31.613135 6451 services_controller.go:444] Built service default/kubernetes LB per-node configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.1\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:443, clusterEndpoints:services.lbEndpoints{Port:6443, V4IPs:[]string{\\\\\\\"192.168.126.11\\\\\\\"}, V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nI0930 17:18:31.613156 6451 services_controller.go:445] Built service default/kubernetes LB template configs for network=default: []services.lbConfig(nil)\\\\nI0930 17:18:31.613139 6451 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-machine-api/control-plane-machine-set-operator_TCP_cluster\\\\\\\", UUID:\\\\\\\"589f95f7-f3e2-4140-80ed-9a0717201481\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-machine-api/control-plane-machine-set-operator\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:19:00Z\\\",\\\"message\\\":\\\" *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:19:00.501933 6817 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.501997 6817 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502069 6817 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502095 6817 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502147 6817 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502288 6817 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.514772 6817 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 17:19:00.514787 6817 
services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 17:19:00.514860 6817 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:19:00.514885 6817 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:19:00.514987 6817 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.750518 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.762783 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.775937 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disable
d\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name
\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.791526 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ba2dd80ae8843f4b47955ec634603e4a462a2f3480f75ce8a6fcbcb628dd75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:50Z\\\",\\\"message\\\":\\\"2025-09-30T17:18:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a\\\\n2025-09-30T17:18:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a to /host/opt/cni/bin/\\\\n2025-09-30T17:18:04Z [verbose] multus-daemon started\\\\n2025-09-30T17:18:04Z [verbose] Readiness Indicator file check\\\\n2025-09-30T17:18:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.799814 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.810545 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 
17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.818845 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.834806 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66b4b432-1e27-4651-8833-d899bc384dd1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf0e976c966dec145ecd3eb79013b835ce2a3a942935cafd7e38d8e87b182aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f76c10db4f564ed99cb9f9d92fb32b9276ac7185ab96c1f2cdc2c1123ea44b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0346e20c576a248b7661996c6dcbc130d6b235e6bf30989e2df843d917fd687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.835096 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.835133 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.835143 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.835160 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.835174 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:01Z","lastTransitionTime":"2025-09-30T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.847266 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.858577 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.871441 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.882385 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.901028 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.913353 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.926919 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.937908 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.937961 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.937995 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.938018 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.938038 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:01Z","lastTransitionTime":"2025-09-30T17:19:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.938922 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:01 crc kubenswrapper[4778]: I0930 17:19:01.949454 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:01Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.040968 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.041019 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.041030 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.041049 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.041062 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:02Z","lastTransitionTime":"2025-09-30T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.143977 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.144029 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.144040 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.144055 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.144068 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:02Z","lastTransitionTime":"2025-09-30T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.247194 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.247631 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.247643 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.247659 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.247673 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:02Z","lastTransitionTime":"2025-09-30T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.258108 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovnkube-controller/3.log" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.262714 4778 scope.go:117] "RemoveContainer" containerID="2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca" Sep 30 17:19:02 crc kubenswrapper[4778]: E0930 17:19:02.262870 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.275824 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.289265 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.303854 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1
b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.323970 4778 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32
fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:19:00Z\\\",\\\"message\\\":\\\" *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:19:00.501933 6817 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.501997 6817 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502069 6817 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502095 6817 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502147 6817 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502288 6817 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.514772 6817 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 17:19:00.514787 6817 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 17:19:00.514860 6817 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:19:00.514885 6817 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:19:00.514987 6817 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.339094 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.350921 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.350960 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.350973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.350989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.351001 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:02Z","lastTransitionTime":"2025-09-30T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.354711 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.370629 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.383803 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ba2dd80ae8843f4b47955ec634603e4a462a2f3480f75ce8a6fcbcb628dd75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:50Z\\\",\\\"message\\\":\\\"2025-09-30T17:18:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a\\\\n2025-09-30T17:18:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a to /host/opt/cni/bin/\\\\n2025-09-30T17:18:04Z [verbose] multus-daemon started\\\\n2025-09-30T17:18:04Z [verbose] Readiness Indicator file check\\\\n2025-09-30T17:18:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.393818 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.406034 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 
17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.422855 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.435029 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66b4b432-1e27-4651-8833-d899bc384dd1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf0e976c966dec145ecd3eb79013b835ce2a3a942935cafd7e38d8e87b182aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f76c10db4f564ed99cb9f9d92fb32b9276ac7185ab96c1f2cdc2c1123ea44b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0346e20c576a248b7661996c6dcbc130d6b235e6bf30989e2df843d917fd687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.445961 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.453855 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.453896 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.453908 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.453924 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.453937 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:02Z","lastTransitionTime":"2025-09-30T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.463323 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.476273 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.488275 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.506488 4778 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.519875 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:02Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.557362 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.557407 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.557418 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.557434 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.557446 4778 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:02Z","lastTransitionTime":"2025-09-30T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.660315 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.660363 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.660375 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.660389 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.660399 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:02Z","lastTransitionTime":"2025-09-30T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.713196 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:02 crc kubenswrapper[4778]: E0930 17:19:02.713341 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.713443 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:02 crc kubenswrapper[4778]: E0930 17:19:02.713735 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.763002 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.763107 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.763134 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.763167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.763189 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:02Z","lastTransitionTime":"2025-09-30T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.866375 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.866468 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.866492 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.866524 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.866544 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:02Z","lastTransitionTime":"2025-09-30T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.968691 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.968746 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.968756 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.968772 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:02 crc kubenswrapper[4778]: I0930 17:19:02.968781 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:02Z","lastTransitionTime":"2025-09-30T17:19:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.072373 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.072420 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.072432 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.072449 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.072461 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:03Z","lastTransitionTime":"2025-09-30T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.175509 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.175553 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.175625 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.175643 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.175655 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:03Z","lastTransitionTime":"2025-09-30T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.277521 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.277564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.277573 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.277588 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.277600 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:03Z","lastTransitionTime":"2025-09-30T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.380300 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.380337 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.380346 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.380361 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.380370 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:03Z","lastTransitionTime":"2025-09-30T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.483456 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.483501 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.483513 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.483529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.483541 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:03Z","lastTransitionTime":"2025-09-30T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.586380 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.586429 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.586439 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.586455 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.586467 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:03Z","lastTransitionTime":"2025-09-30T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.689125 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.689196 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.689210 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.689229 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.689243 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:03Z","lastTransitionTime":"2025-09-30T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.713982 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.713988 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:03 crc kubenswrapper[4778]: E0930 17:19:03.714185 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:03 crc kubenswrapper[4778]: E0930 17:19:03.714272 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.792007 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.792048 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.792059 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.792074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.792086 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:03Z","lastTransitionTime":"2025-09-30T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.894815 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.894870 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.894886 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.894910 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.894928 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:03Z","lastTransitionTime":"2025-09-30T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.998081 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.998148 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.998165 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.998190 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:03 crc kubenswrapper[4778]: I0930 17:19:03.998208 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:03Z","lastTransitionTime":"2025-09-30T17:19:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.101453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.101514 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.101527 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.101543 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.101556 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:04Z","lastTransitionTime":"2025-09-30T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.205117 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.205181 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.205199 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.205226 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.205244 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:04Z","lastTransitionTime":"2025-09-30T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.308550 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.308657 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.308684 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.308717 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.308740 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:04Z","lastTransitionTime":"2025-09-30T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.412464 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.412540 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.412557 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.412585 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.412605 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:04Z","lastTransitionTime":"2025-09-30T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.516505 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.516596 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.516663 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.516692 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.516718 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:04Z","lastTransitionTime":"2025-09-30T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.619434 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.619507 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.619520 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.619544 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.619578 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:04Z","lastTransitionTime":"2025-09-30T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.713866 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.713866 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:04 crc kubenswrapper[4778]: E0930 17:19:04.714208 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:04 crc kubenswrapper[4778]: E0930 17:19:04.714507 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.722817 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.722900 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.722922 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.722946 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.722965 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:04Z","lastTransitionTime":"2025-09-30T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.826377 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.826428 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.826453 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.826472 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.826483 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:04Z","lastTransitionTime":"2025-09-30T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.929285 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.929333 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.929347 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.929364 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:04 crc kubenswrapper[4778]: I0930 17:19:04.929378 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:04Z","lastTransitionTime":"2025-09-30T17:19:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.031778 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.031815 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.031823 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.031840 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.031849 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:05Z","lastTransitionTime":"2025-09-30T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.133650 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.133712 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.133726 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.133745 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.133760 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:05Z","lastTransitionTime":"2025-09-30T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.236459 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.236509 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.236521 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.236540 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.236557 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:05Z","lastTransitionTime":"2025-09-30T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.339973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.340039 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.340052 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.340072 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.340087 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:05Z","lastTransitionTime":"2025-09-30T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.442730 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.442786 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.442798 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.442815 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.442828 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:05Z","lastTransitionTime":"2025-09-30T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.545672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.545735 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.545756 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.545778 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.545794 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:05Z","lastTransitionTime":"2025-09-30T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.648553 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.648600 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.648613 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.648654 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.648667 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:05Z","lastTransitionTime":"2025-09-30T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.703862 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.704014 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.704065 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:05 crc kubenswrapper[4778]: E0930 17:19:05.704191 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.704151881 +0000 UTC m=+148.694049734 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:19:05 crc kubenswrapper[4778]: E0930 17:19:05.704229 4778 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.704296 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:05 crc kubenswrapper[4778]: E0930 17:19:05.704343 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.704313077 +0000 UTC m=+148.694210950 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 17:19:05 crc kubenswrapper[4778]: E0930 17:19:05.704244 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.704374 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:05 crc kubenswrapper[4778]: E0930 17:19:05.704398 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:19:05 crc kubenswrapper[4778]: E0930 17:19:05.704422 4778 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:19:05 crc kubenswrapper[4778]: E0930 17:19:05.704469 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.704455942 +0000 UTC m=+148.694353755 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:19:05 crc kubenswrapper[4778]: E0930 17:19:05.704542 4778 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:19:05 crc kubenswrapper[4778]: E0930 17:19:05.704660 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.704635598 +0000 UTC m=+148.694533401 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 17:19:05 crc kubenswrapper[4778]: E0930 17:19:05.704680 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 17:19:05 crc kubenswrapper[4778]: E0930 17:19:05.704711 4778 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 17:19:05 crc kubenswrapper[4778]: E0930 17:19:05.704728 4778 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:19:05 crc kubenswrapper[4778]: E0930 17:19:05.704798 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.704780833 +0000 UTC m=+148.694678676 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.713053 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.713073 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:05 crc kubenswrapper[4778]: E0930 17:19:05.713421 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:05 crc kubenswrapper[4778]: E0930 17:19:05.713540 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.724720 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.751974 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.752055 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.752101 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.752155 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.752187 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:05Z","lastTransitionTime":"2025-09-30T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.855355 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.855417 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.855427 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.855448 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.855713 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:05Z","lastTransitionTime":"2025-09-30T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.957920 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.957965 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.957974 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.957992 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:05 crc kubenswrapper[4778]: I0930 17:19:05.958001 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:05Z","lastTransitionTime":"2025-09-30T17:19:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.060561 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.060604 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.060639 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.060657 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.060668 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:06Z","lastTransitionTime":"2025-09-30T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.164136 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.164232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.164272 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.164312 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.164336 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:06Z","lastTransitionTime":"2025-09-30T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.267803 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.267866 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.267877 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.267894 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.267908 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:06Z","lastTransitionTime":"2025-09-30T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.369920 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.369990 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.370001 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.370025 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.370041 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:06Z","lastTransitionTime":"2025-09-30T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.473021 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.473078 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.473095 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.473112 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.473125 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:06Z","lastTransitionTime":"2025-09-30T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.577081 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.577145 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.577164 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.577190 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.577211 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:06Z","lastTransitionTime":"2025-09-30T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.679854 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.679908 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.679922 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.679943 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.679958 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:06Z","lastTransitionTime":"2025-09-30T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.713798 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.713966 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:06 crc kubenswrapper[4778]: E0930 17:19:06.714096 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:06 crc kubenswrapper[4778]: E0930 17:19:06.714244 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.782291 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.782331 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.782342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.782357 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.782367 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:06Z","lastTransitionTime":"2025-09-30T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.885064 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.885112 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.885128 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.885147 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.885158 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:06Z","lastTransitionTime":"2025-09-30T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.987734 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.987779 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.987789 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.987805 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:06 crc kubenswrapper[4778]: I0930 17:19:06.987817 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:06Z","lastTransitionTime":"2025-09-30T17:19:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.090448 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.090512 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.090529 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.090556 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.090577 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:07Z","lastTransitionTime":"2025-09-30T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.192727 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.192813 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.192833 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.192866 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.192886 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:07Z","lastTransitionTime":"2025-09-30T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.294868 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.294913 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.294925 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.294942 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.294951 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:07Z","lastTransitionTime":"2025-09-30T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.398112 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.398174 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.398186 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.398204 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.398218 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:07Z","lastTransitionTime":"2025-09-30T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.501372 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.501449 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.501470 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.501498 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.501515 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:07Z","lastTransitionTime":"2025-09-30T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.604312 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.604366 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.604382 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.604402 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.604415 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:07Z","lastTransitionTime":"2025-09-30T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.708800 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.708866 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.708879 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.708901 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.708917 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:07Z","lastTransitionTime":"2025-09-30T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.713286 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.713575 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:07 crc kubenswrapper[4778]: E0930 17:19:07.713774 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:07 crc kubenswrapper[4778]: E0930 17:19:07.713830 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.811805 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.811878 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.811911 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.811951 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.811976 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:07Z","lastTransitionTime":"2025-09-30T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.915379 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.915447 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.915470 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.915505 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:07 crc kubenswrapper[4778]: I0930 17:19:07.915530 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:07Z","lastTransitionTime":"2025-09-30T17:19:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.018284 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.018339 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.018348 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.018364 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.018375 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:08Z","lastTransitionTime":"2025-09-30T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.121022 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.121095 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.121118 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.121147 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.121169 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:08Z","lastTransitionTime":"2025-09-30T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.225049 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.225104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.225113 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.225132 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.225144 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:08Z","lastTransitionTime":"2025-09-30T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.328579 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.328745 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.328774 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.328827 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.328854 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:08Z","lastTransitionTime":"2025-09-30T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.431519 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.431569 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.431581 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.431601 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.431638 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:08Z","lastTransitionTime":"2025-09-30T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.534170 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.534217 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.534227 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.534242 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.534253 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:08Z","lastTransitionTime":"2025-09-30T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.638742 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.638807 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.638820 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.638843 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.638857 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:08Z","lastTransitionTime":"2025-09-30T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.713988 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.714073 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:08 crc kubenswrapper[4778]: E0930 17:19:08.714270 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:08 crc kubenswrapper[4778]: E0930 17:19:08.714562 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.741343 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.741393 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.741404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.741421 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.741434 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:08Z","lastTransitionTime":"2025-09-30T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.843961 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.844004 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.844015 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.844096 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.844108 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:08Z","lastTransitionTime":"2025-09-30T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.946793 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.946840 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.946856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.946875 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:08 crc kubenswrapper[4778]: I0930 17:19:08.946894 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:08Z","lastTransitionTime":"2025-09-30T17:19:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.048982 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.049037 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.049050 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.049068 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.049080 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:09Z","lastTransitionTime":"2025-09-30T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.131196 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.131238 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.131249 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.131263 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.131272 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:09Z","lastTransitionTime":"2025-09-30T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:09 crc kubenswrapper[4778]: E0930 17:19:09.146933 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.151514 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.151544 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.151553 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.151659 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.151673 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:09Z","lastTransitionTime":"2025-09-30T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:09 crc kubenswrapper[4778]: E0930 17:19:09.174232 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.180519 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.180579 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.180590 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.180606 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.180639 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:09Z","lastTransitionTime":"2025-09-30T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:09 crc kubenswrapper[4778]: E0930 17:19:09.205752 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.210437 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.210483 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.210494 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.210509 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.210521 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:09Z","lastTransitionTime":"2025-09-30T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:09 crc kubenswrapper[4778]: E0930 17:19:09.224983 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.228923 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.228981 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.228992 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.229008 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.229032 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:09Z","lastTransitionTime":"2025-09-30T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:09 crc kubenswrapper[4778]: E0930 17:19:09.243244 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:09Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:09 crc kubenswrapper[4778]: E0930 17:19:09.243360 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.244900 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.244927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.244938 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.244952 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.244961 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:09Z","lastTransitionTime":"2025-09-30T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.348036 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.348089 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.348097 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.348113 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.348126 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:09Z","lastTransitionTime":"2025-09-30T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.451115 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.451391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.451475 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.451572 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.451723 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:09Z","lastTransitionTime":"2025-09-30T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.554122 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.554195 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.554218 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.554244 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.554262 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:09Z","lastTransitionTime":"2025-09-30T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.657006 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.657062 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.657074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.657092 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.657109 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:09Z","lastTransitionTime":"2025-09-30T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.713836 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.713868 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:09 crc kubenswrapper[4778]: E0930 17:19:09.713997 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
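Every NotReady heartbeat and pod sync failure in this stretch repeats the same reason: kubelet sees no CNI configuration file in /etc/kubernetes/cni/net.d/, so the container runtime network stays NetworkReady=false until ovn-kubernetes writes its config there. A short sketch of the equivalent check, assuming Go; the helper name is made up, and the extension list mirrors in spirit what the CNI config loader accepts:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig is a hypothetical helper approximating the readiness test
// behind the log message: the network counts as configured once at least
// one .conf/.conflist/.json file exists in the CNI conf directory.
func hasCNIConfig(confDir string) (bool, error) {
	entries, err := os.ReadDir(confDir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	// Path taken from the log message above.
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	fmt.Println("cni configured:", ok, "err:", err)
}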
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:09 crc kubenswrapper[4778]: E0930 17:19:09.714106 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.759415 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.759461 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.759473 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.759496 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.759512 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:09Z","lastTransitionTime":"2025-09-30T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.862154 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.862244 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.862257 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.862275 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.862291 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:09Z","lastTransitionTime":"2025-09-30T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.964934 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.964981 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.964996 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.965014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:09 crc kubenswrapper[4778]: I0930 17:19:09.965028 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:09Z","lastTransitionTime":"2025-09-30T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.068024 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.068071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.068084 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.068104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.068117 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:10Z","lastTransitionTime":"2025-09-30T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.170146 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.170198 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.170210 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.170232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.170247 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:10Z","lastTransitionTime":"2025-09-30T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.272813 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.272885 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.272894 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.272908 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.272918 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:10Z","lastTransitionTime":"2025-09-30T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.375135 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.375172 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.375183 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.375201 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.375211 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:10Z","lastTransitionTime":"2025-09-30T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.477683 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.477742 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.477752 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.477766 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.477776 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:10Z","lastTransitionTime":"2025-09-30T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.579670 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.579705 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.579716 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.579730 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.579739 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:10Z","lastTransitionTime":"2025-09-30T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.682328 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.682416 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.682435 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.682461 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.682478 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:10Z","lastTransitionTime":"2025-09-30T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.713451 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.713451 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:10 crc kubenswrapper[4778]: E0930 17:19:10.713725 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
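The condition object carried by each setters.go record above is plain JSON, and decoding one makes the fields easier to scan than the flowed log text. A minimal sketch, assuming Go and only the standard library; the struct covers just the fields visible in these records, not the full corev1.NodeCondition:

package main

import (
	"encoding/json"
	"fmt"
)

// nodeCondition mirrors only the fields present in the log records above.
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Condition copied verbatim from one of the setters.go records above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:09Z","lastTransitionTime":"2025-09-30T17:19:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		fmt.Println("unmarshal:", err)
		return
	}
	fmt.Printf("%s=%s since %s: %s (%s)\n", c.Type, c.Status, c.LastTransitionTime, c.Reason, c.Message)
}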
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:10 crc kubenswrapper[4778]: E0930 17:19:10.713874 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.785002 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.785042 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.785051 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.785065 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.785077 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:10Z","lastTransitionTime":"2025-09-30T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.888072 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.888121 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.888132 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.888149 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.888160 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:10Z","lastTransitionTime":"2025-09-30T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.990593 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.990661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.990673 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.990689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:10 crc kubenswrapper[4778]: I0930 17:19:10.990698 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:10Z","lastTransitionTime":"2025-09-30T17:19:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.093673 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.093732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.093751 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.093771 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.093783 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:11Z","lastTransitionTime":"2025-09-30T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.196748 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.196809 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.196825 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.196847 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.196860 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:11Z","lastTransitionTime":"2025-09-30T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.299756 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.299801 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.299810 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.299825 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.299836 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:11Z","lastTransitionTime":"2025-09-30T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.403754 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.403808 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.403818 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.403837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.403847 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:11Z","lastTransitionTime":"2025-09-30T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.505919 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.506035 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.506049 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.506064 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.506074 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:11Z","lastTransitionTime":"2025-09-30T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.608220 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.608283 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.608293 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.608308 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.608319 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:11Z","lastTransitionTime":"2025-09-30T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.711391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.711470 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.711498 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.711528 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.711549 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:11Z","lastTransitionTime":"2025-09-30T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.713772 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:11 crc kubenswrapper[4778]: E0930 17:19:11.713954 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.714028 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:11 crc kubenswrapper[4778]: E0930 17:19:11.714154 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.731501 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ba2dd80ae8843f4b47955ec634603e4a462a2f3480f75ce8a6fcbcb628dd75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:50Z\\\",\\\"message\\\":\\\"2025-09-30T17:18:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a\\\\n2025-09-30T17:18:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a to /host/opt/cni/bin/\\\\n2025-09-30T17:18:04Z [verbose] multus-daemon started\\\\n2025-09-30T17:18:04Z [verbose] Readiness Indicator file check\\\\n2025-09-30T17:18:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
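This termination message from the kube-multus container records the same dependency from the other side: multus polled for the readiness indicator file 10-ovn-kubernetes.conf from 17:18:04 until 17:18:49 and then gave up with the poll timeout quoted just below. A stdlib-only sketch of such a wait loop, assuming Go (multus itself uses apimachinery wait helpers; every name here is illustrative):

package main

import (
	"errors"
	"fmt"
	"os"
	"time"
)

// waitForFile is a hypothetical stand-in for the readiness-indicator wait:
// check immediately, then at each interval, until the file exists or the
// timeout elapses, which is the failure mode seen in the multus log above.
func waitForFile(path string, interval, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		if _, err := os.Stat(path); err == nil {
			return nil
		}
		if time.Now().After(deadline) {
			return errors.New("timed out waiting for the condition")
		}
		time.Sleep(interval)
	}
}

func main() {
	// Path from the log; the 45s budget is inferred from the
	// 17:18:04 -> 17:18:49 window in the termination message.
	err := waitForFile("/host/run/multus/cni/net.d/10-ovn-kubernetes.conf", time.Second, 45*time.Second)
	fmt.Println("readiness indicator:", err)
}

With that inferred budget, the loop fails exactly the way the container did whenever ovn-kubernetes has not yet produced its CNI config.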
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.744186 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.758093 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 
17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.771259 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.782094 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bdb424-5ee8-4b98-bf00-e42c13f0eebe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cce4b7a820073251d209e35cc2595c5d766a0d77d399ece693243088624c282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2727c5209e9ea11c33f67caf07f39964942e1fd413cc2c88082ecd55a3604e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2727c5209e9ea11c33f67caf07f39964942e1fd413cc2c88082ecd55a3604e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.794788 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"66b4b432-1e27-4651-8833-d899bc384dd1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf0e976c966dec145ecd3eb79013b835ce2a3a942935cafd7e38d8e87b182aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f76c10db4f564ed99cb9f9d92fb32b9276ac7185ab96c1f2cdc2c1123ea44b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0346e20c576a248b7661996c6dcbc130d6b235e6bf30989e2df843d917fd687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.805955 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.814508 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.814552 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.814562 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.814578 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.814591 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:11Z","lastTransitionTime":"2025-09-30T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.819010 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.839187 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.850733 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.868509 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be
8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.882916 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.897195 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.909189 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.917320 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.917351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.917360 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.917375 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.917384 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:11Z","lastTransitionTime":"2025-09-30T17:19:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.922683 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.939575 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:19:00Z\\\",\\\"message\\\":\\\" *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:19:00.501933 6817 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.501997 6817 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502069 6817 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502095 6817 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502147 6817 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502288 6817 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.514772 6817 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 17:19:00.514787 6817 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 17:19:00.514860 6817 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:19:00.514885 6817 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:19:00.514987 6817 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.956432 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.968580 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:11 crc kubenswrapper[4778]: I0930 17:19:11.985401 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:11Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.020263 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.020294 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:12 crc 
kubenswrapper[4778]: I0930 17:19:12.020302 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.020317 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.020327 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:12Z","lastTransitionTime":"2025-09-30T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.122550 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.122590 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.122600 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.122636 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.122650 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:12Z","lastTransitionTime":"2025-09-30T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.225723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.225759 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.225771 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.225787 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.225833 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:12Z","lastTransitionTime":"2025-09-30T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.327708 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.327760 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.327785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.327808 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.327823 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:12Z","lastTransitionTime":"2025-09-30T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.430077 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.430116 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.430126 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.430141 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.430152 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:12Z","lastTransitionTime":"2025-09-30T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.533053 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.533101 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.533111 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.533127 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.533139 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:12Z","lastTransitionTime":"2025-09-30T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.635061 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.635094 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.635103 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.635117 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.635127 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:12Z","lastTransitionTime":"2025-09-30T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.713248 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.713323 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:12 crc kubenswrapper[4778]: E0930 17:19:12.713431 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:12 crc kubenswrapper[4778]: E0930 17:19:12.713592 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.738273 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.738308 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.738316 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.738332 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.738344 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:12Z","lastTransitionTime":"2025-09-30T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.840378 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.840437 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.840455 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.840480 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.840500 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:12Z","lastTransitionTime":"2025-09-30T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.944048 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.944084 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.944097 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.944113 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:12 crc kubenswrapper[4778]: I0930 17:19:12.944124 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:12Z","lastTransitionTime":"2025-09-30T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.047301 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.047391 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.047411 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.047436 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.047453 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:13Z","lastTransitionTime":"2025-09-30T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.151060 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.151140 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.151178 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.151213 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.151238 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:13Z","lastTransitionTime":"2025-09-30T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.255022 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.255091 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.255110 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.255135 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.255155 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:13Z","lastTransitionTime":"2025-09-30T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.357940 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.357981 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.357994 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.358010 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.358028 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:13Z","lastTransitionTime":"2025-09-30T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.461511 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.461580 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.461599 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.461653 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.461675 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:13Z","lastTransitionTime":"2025-09-30T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.563905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.563948 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.563963 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.563986 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.564002 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:13Z","lastTransitionTime":"2025-09-30T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.666785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.666831 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.666840 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.666856 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.666867 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:13Z","lastTransitionTime":"2025-09-30T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.713845 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.713886 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:13 crc kubenswrapper[4778]: E0930 17:19:13.714015 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
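
Note on the NodeNotReady loop: the kubelet re-records the same five node events roughly every 100ms (17:19:12.020, .122, .225, ...) because the container runtime keeps reporting NetworkReady=false, and the Ready condition's message names the exact cause: no CNI configuration file in /etc/kubernetes/cni/net.d/. A minimal sketch of the same check follows; the directory path is taken from the log message, and the accepted extensions follow the usual CNI loader convention (an assumption, not quoted from the log):

// cni_conf_check.go - a minimal sketch, not kubelet code: list candidate CNI
// configuration files in the directory named in the log. The kubelet keeps
// reporting NetworkReady=false until at least one config file appears here.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	dir := "/etc/kubernetes/cni/net.d" // path taken from the log message
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", dir, err)
		return
	}
	found := 0
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // conventional CNI config extensions
			fmt.Println("found CNI config:", filepath.Join(dir, e.Name()))
			found++
		}
	}
	if found == 0 {
		fmt.Println("no CNI configuration file found; network plugin not ready")
	}
}

The directory stays empty here because the pods that would write the config (multus, the OVN stack) are themselves blocked behind the expired webhook certificate noted earlier.
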
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:13 crc kubenswrapper[4778]: E0930 17:19:13.714101 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.769657 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.769721 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.769737 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.769768 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.769786 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:13Z","lastTransitionTime":"2025-09-30T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.872696 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.872782 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.872801 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.872826 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.872847 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:13Z","lastTransitionTime":"2025-09-30T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.975801 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.975880 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.975906 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.975939 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:13 crc kubenswrapper[4778]: I0930 17:19:13.975964 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:13Z","lastTransitionTime":"2025-09-30T17:19:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.079281 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.079359 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.079372 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.079416 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.079431 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:14Z","lastTransitionTime":"2025-09-30T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.182273 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.182336 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.182346 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.182362 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.182374 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:14Z","lastTransitionTime":"2025-09-30T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.285436 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.285515 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.285534 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.285560 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.285580 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:14Z","lastTransitionTime":"2025-09-30T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.387994 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.388030 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.388039 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.388052 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.388063 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:14Z","lastTransitionTime":"2025-09-30T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.490588 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.490675 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.490685 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.490700 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.490710 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:14Z","lastTransitionTime":"2025-09-30T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.593040 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.593114 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.593125 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.593140 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.593152 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:14Z","lastTransitionTime":"2025-09-30T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.695467 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.695513 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.695526 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.695542 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.695554 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:14Z","lastTransitionTime":"2025-09-30T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.713782 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.713814 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:14 crc kubenswrapper[4778]: E0930 17:19:14.713972 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
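
The condition={...} payload that setters.go attaches to each "Node became not ready" record is plain JSON, so the transition reason and message can be extracted mechanically when triaging a large journal. A small sketch, using a payload copied verbatim from one of the lines above:

// condition_parse.go - a minimal sketch: decode the condition={...} JSON
// payload logged on every "Node became not ready" line so the transition
// reason and message can be pulled out when grepping large journals.
package main

import (
	"encoding/json"
	"fmt"
	"log"
)

// NodeCondition mirrors the fields present in the logged payload.
type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Payload copied from one of the log lines above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:12Z","lastTransitionTime":"2025-09-30T17:19:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
	var c NodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		log.Fatalf("decode condition: %v", err)
	}
	fmt.Printf("%s=%s reason=%s at %s\n", c.Type, c.Status, c.Reason, c.LastTransitionTime)
}
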
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:14 crc kubenswrapper[4778]: E0930 17:19:14.714028 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.798519 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.798563 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.798576 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.798595 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.798608 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:14Z","lastTransitionTime":"2025-09-30T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.901135 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.901186 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.901199 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.901217 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:14 crc kubenswrapper[4778]: I0930 17:19:14.901232 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:14Z","lastTransitionTime":"2025-09-30T17:19:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.004098 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.004139 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.004148 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.004163 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.004176 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:15Z","lastTransitionTime":"2025-09-30T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.107439 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.107499 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.107523 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.107552 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.107574 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:15Z","lastTransitionTime":"2025-09-30T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.210535 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.210608 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.210670 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.210705 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.210730 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:15Z","lastTransitionTime":"2025-09-30T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.313692 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.313781 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.313819 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.313842 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.313855 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:15Z","lastTransitionTime":"2025-09-30T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.416864 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.416913 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.416927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.416943 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.416953 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:15Z","lastTransitionTime":"2025-09-30T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.519982 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.520044 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.520061 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.520083 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.520100 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:15Z","lastTransitionTime":"2025-09-30T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.621955 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.622002 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.622011 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.622026 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.622038 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:15Z","lastTransitionTime":"2025-09-30T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.713455 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.713543 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:15 crc kubenswrapper[4778]: E0930 17:19:15.713681 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:15 crc kubenswrapper[4778]: E0930 17:19:15.713778 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
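
Interleaved with the node events, pod_workers.go logs "Error syncing pod, skipping" each time sandbox creation is retried for the four network pods seen above (network-check-source, network-metrics-daemon, networking-console-plugin, network-check-target). A sketch that tallies those records per pod from a saved journal dump; the input file name is hypothetical:

// sync_errors.go - a minimal sketch: scan a saved journal dump and count
// "Error syncing pod, skipping" records per pod, to see which pods are stuck
// behind the NetworkPluginNotReady condition.
package main

import (
	"bufio"
	"fmt"
	"log"
	"os"
	"regexp"
)

func main() {
	f, err := os.Open("kubelet-journal.log") // hypothetical dump of the log above
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	re := regexp.MustCompile(`Error syncing pod, skipping.*pod="([^"]+)"`)
	counts := map[string]int{}
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines are long
	for sc.Scan() {
		if m := re.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]]++
		}
	}
	if err := sc.Err(); err != nil {
		log.Fatal(err)
	}
	for pod, n := range counts {
		fmt.Printf("%4d %s\n", n, pod)
	}
}
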
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.725074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.725126 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.725142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.725163 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.725178 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:15Z","lastTransitionTime":"2025-09-30T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.827458 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.827506 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.827521 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.827539 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.827551 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:15Z","lastTransitionTime":"2025-09-30T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.929933 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.929986 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.930004 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.930022 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:15 crc kubenswrapper[4778]: I0930 17:19:15.930032 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:15Z","lastTransitionTime":"2025-09-30T17:19:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.032824 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.032862 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.032878 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.032899 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.032909 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:16Z","lastTransitionTime":"2025-09-30T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.135413 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.135466 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.135475 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.135491 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.135502 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:16Z","lastTransitionTime":"2025-09-30T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.238053 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.238106 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.238119 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.238137 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.238149 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:16Z","lastTransitionTime":"2025-09-30T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.339755 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.339802 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.339816 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.339867 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.339879 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:16Z","lastTransitionTime":"2025-09-30T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.442514 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.442664 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.442681 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.442703 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.442718 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:16Z","lastTransitionTime":"2025-09-30T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.545358 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.545412 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.545422 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.545438 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.545450 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:16Z","lastTransitionTime":"2025-09-30T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.648450 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.648509 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.648523 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.648543 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.648558 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:16Z","lastTransitionTime":"2025-09-30T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.713010 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.713095 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:16 crc kubenswrapper[4778]: E0930 17:19:16.713396 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:16 crc kubenswrapper[4778]: E0930 17:19:16.713521 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.714852 4778 scope.go:117] "RemoveContainer" containerID="2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca" Sep 30 17:19:16 crc kubenswrapper[4778]: E0930 17:19:16.715125 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.751970 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.752114 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.752128 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.752149 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.752163 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:16Z","lastTransitionTime":"2025-09-30T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.854572 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.854656 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.854666 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.854682 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.854693 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:16Z","lastTransitionTime":"2025-09-30T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.957650 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.957698 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.957716 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.957743 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:16 crc kubenswrapper[4778]: I0930 17:19:16.957756 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:16Z","lastTransitionTime":"2025-09-30T17:19:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.060930 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.060972 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.060984 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.061002 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.061018 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:17Z","lastTransitionTime":"2025-09-30T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.165125 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.165207 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.165227 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.165253 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.165270 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:17Z","lastTransitionTime":"2025-09-30T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.267551 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.267612 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.267642 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.267661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.267678 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:17Z","lastTransitionTime":"2025-09-30T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.370207 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.370251 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.370261 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.370278 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.370289 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:17Z","lastTransitionTime":"2025-09-30T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.473308 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.473378 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.473398 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.473428 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.473451 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:17Z","lastTransitionTime":"2025-09-30T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.577111 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.577715 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.577908 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.578103 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.578306 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:17Z","lastTransitionTime":"2025-09-30T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.686536 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.686585 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.686595 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.686632 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.686643 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:17Z","lastTransitionTime":"2025-09-30T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.714070 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.714147 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:17 crc kubenswrapper[4778]: E0930 17:19:17.715006 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:17 crc kubenswrapper[4778]: E0930 17:19:17.715157 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.789567 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.789671 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.789696 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.789730 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.789756 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:17Z","lastTransitionTime":"2025-09-30T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.892981 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.893086 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.893103 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.893121 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.893133 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:17Z","lastTransitionTime":"2025-09-30T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.995743 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.995815 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.995834 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.995866 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:17 crc kubenswrapper[4778]: I0930 17:19:17.995889 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:17Z","lastTransitionTime":"2025-09-30T17:19:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.099437 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.099482 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.099492 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.099511 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.099524 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:18Z","lastTransitionTime":"2025-09-30T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.202364 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.202447 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.202471 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.202504 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.202528 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:18Z","lastTransitionTime":"2025-09-30T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.306404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.306487 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.306517 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.306553 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.306577 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:18Z","lastTransitionTime":"2025-09-30T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.409318 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.409367 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.409378 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.409392 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.409404 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:18Z","lastTransitionTime":"2025-09-30T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.512497 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.512572 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.512589 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.512634 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.512648 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:18Z","lastTransitionTime":"2025-09-30T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.616067 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.616150 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.616176 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.616211 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.616238 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:18Z","lastTransitionTime":"2025-09-30T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.713126 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.713179 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:18 crc kubenswrapper[4778]: E0930 17:19:18.713376 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:18 crc kubenswrapper[4778]: E0930 17:19:18.713557 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.719082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.719135 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.719146 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.719164 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.719175 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:18Z","lastTransitionTime":"2025-09-30T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.822672 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.822932 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.822956 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.822988 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.823013 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:18Z","lastTransitionTime":"2025-09-30T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.926762 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.926817 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.926830 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.926849 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:18 crc kubenswrapper[4778]: I0930 17:19:18.926861 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:18Z","lastTransitionTime":"2025-09-30T17:19:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.030723 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.030789 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.030805 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.030831 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.030848 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:19Z","lastTransitionTime":"2025-09-30T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.133293 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.133368 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.133386 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.133413 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.133432 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:19Z","lastTransitionTime":"2025-09-30T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.237332 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.237436 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.237446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.237469 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.237484 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:19Z","lastTransitionTime":"2025-09-30T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.343635 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.343705 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.343713 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.343733 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.343745 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:19Z","lastTransitionTime":"2025-09-30T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.447090 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.447183 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.447201 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.447224 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.447241 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:19Z","lastTransitionTime":"2025-09-30T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.550109 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.550158 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.550167 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.550278 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.550298 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:19Z","lastTransitionTime":"2025-09-30T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.575058 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.575153 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.575171 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.575199 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.575221 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:19Z","lastTransitionTime":"2025-09-30T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:19 crc kubenswrapper[4778]: E0930 17:19:19.590937 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.595460 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.595505 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.595514 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.595531 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.595543 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:19Z","lastTransitionTime":"2025-09-30T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:19 crc kubenswrapper[4778]: E0930 17:19:19.611386 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.617008 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.617043 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.617054 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.617073 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.617086 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:19Z","lastTransitionTime":"2025-09-30T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:19 crc kubenswrapper[4778]: E0930 17:19:19.633262 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.638595 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.638714 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.638741 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.638785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.638810 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:19Z","lastTransitionTime":"2025-09-30T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:19 crc kubenswrapper[4778]: E0930 17:19:19.655588 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.660930 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.661000 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.661017 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.661045 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.661064 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:19Z","lastTransitionTime":"2025-09-30T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:19 crc kubenswrapper[4778]: E0930 17:19:19.680966 4778 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T17:19:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"11b51a32-4054-4a08-9c60-c43cf343227b\\\",\\\"systemUUID\\\":\\\"af2bf40f-07d8-4ec2-95f5-5dc6be6ea87c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:19Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:19 crc kubenswrapper[4778]: E0930 17:19:19.681153 4778 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.683655 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.683765 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.683779 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.683801 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.683813 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:19Z","lastTransitionTime":"2025-09-30T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.713482 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.713574 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:19 crc kubenswrapper[4778]: E0930 17:19:19.713746 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:19 crc kubenswrapper[4778]: E0930 17:19:19.713967 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.786145 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.786202 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.786216 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.786237 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.786251 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:19Z","lastTransitionTime":"2025-09-30T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.889902 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.889961 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.889981 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.890008 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.890028 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:19Z","lastTransitionTime":"2025-09-30T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.993532 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.993686 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.993720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.993758 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:19 crc kubenswrapper[4778]: I0930 17:19:19.993782 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:19Z","lastTransitionTime":"2025-09-30T17:19:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.099004 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.099111 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.099135 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.099170 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.099197 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:20Z","lastTransitionTime":"2025-09-30T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.202839 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.202905 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.202918 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.202945 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.202963 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:20Z","lastTransitionTime":"2025-09-30T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.306709 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.306769 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.306785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.306807 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.306824 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:20Z","lastTransitionTime":"2025-09-30T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.410322 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.410390 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.410404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.410429 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.410447 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:20Z","lastTransitionTime":"2025-09-30T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.514155 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.514247 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.514272 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.514311 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.514338 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:20Z","lastTransitionTime":"2025-09-30T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.617466 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.617536 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.617551 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.617576 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.617588 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:20Z","lastTransitionTime":"2025-09-30T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.671360 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs\") pod \"network-metrics-daemon-l88vm\" (UID: \"8b0c73d9-9a75-4e65-9220-904133af63fd\") " pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:20 crc kubenswrapper[4778]: E0930 17:19:20.671562 4778 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:19:20 crc kubenswrapper[4778]: E0930 17:19:20.671683 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs podName:8b0c73d9-9a75-4e65-9220-904133af63fd nodeName:}" failed. No retries permitted until 2025-09-30 17:20:24.671656668 +0000 UTC m=+163.661554511 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs") pod "network-metrics-daemon-l88vm" (UID: "8b0c73d9-9a75-4e65-9220-904133af63fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.713779 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.713850 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:20 crc kubenswrapper[4778]: E0930 17:19:20.714047 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:20 crc kubenswrapper[4778]: E0930 17:19:20.714219 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.721387 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.721430 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.721447 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.721470 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.721489 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:20Z","lastTransitionTime":"2025-09-30T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.825342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.825404 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.825422 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.825446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.825466 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:20Z","lastTransitionTime":"2025-09-30T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.928751 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.928799 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.928809 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.928829 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:20 crc kubenswrapper[4778]: I0930 17:19:20.928840 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:20Z","lastTransitionTime":"2025-09-30T17:19:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.031996 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.032065 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.032084 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.032113 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.032132 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:21Z","lastTransitionTime":"2025-09-30T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.134967 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.135014 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.135026 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.135043 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.135056 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:21Z","lastTransitionTime":"2025-09-30T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.237838 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.237948 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.237975 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.238007 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.238033 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:21Z","lastTransitionTime":"2025-09-30T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.341083 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.341142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.341170 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.341196 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.341214 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:21Z","lastTransitionTime":"2025-09-30T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.445013 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.445087 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.445108 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.445142 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.445164 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:21Z","lastTransitionTime":"2025-09-30T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.548846 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.548909 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.548927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.548954 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.548972 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:21Z","lastTransitionTime":"2025-09-30T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.651594 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.651697 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.651716 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.651739 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.651756 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:21Z","lastTransitionTime":"2025-09-30T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.713871 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.713921 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:21 crc kubenswrapper[4778]: E0930 17:19:21.714120 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:21 crc kubenswrapper[4778]: E0930 17:19:21.714277 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.738575 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1737d305-b819-48f8-b703-6b5549129dd2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://526255408a1dafaf05b48ec5f11f999f0a1b937a03ff4c4dbc682db5ae858c86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9eec3808e3e6033042ddfb909bcfaa53bc54a58472efdbf5ba2d86b9a5b3ab02\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://29e1b0d6d8dc50db78720b23d18846a4b72d8c5f48a0726ec20f4cb7e7af5566\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f4f4678639212590591bd7393d373ecdd123f6fa6a28c162a716ca6b7892477\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:05Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79
f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff55d5129b78695ccbe4c18ddd2418f8ef26fa55d3a978cfc8fb8db08ec33f76\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:06Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://742bee1af6ca31fb0645977e2181a32a5c982bf38eba89476c67c6de0122e023\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc641c3db2184fe8b9e63a8000577e88ca0b28865dc4632678332f503984a0b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jjsgb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\
\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-cwrn6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.754074 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.754128 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.754138 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.754159 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.754169 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:21Z","lastTransitionTime":"2025-09-30T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.762422 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"deb38969-9012-468f-87aa-2e70a5f8f3c4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2b33e20d080c52728fb95819f8441a050f8e77b7
34b98843d99d069c978f35ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:19:00Z\\\",\\\"message\\\":\\\" *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0930 17:19:00.501933 6817 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.501997 6817 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502069 6817 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502095 6817 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502147 6817 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.502288 6817 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 17:19:00.514772 6817 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0930 17:19:00.514787 6817 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0930 17:19:00.514860 6817 ovnkube.go:599] Stopped ovnkube\\\\nI0930 17:19:00.514885 6817 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0930 17:19:00.514987 6817 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mx6xj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:03Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kzlfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.778152 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b7c821bd6f2b2a89c7b6d87ce5586aa3220e3da0ddb8374f901363ceadf57d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.794920 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.807930 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v2b4f" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a2b9e51-adbe-4bba-9e7c-facada66c035\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67f36a3e59af288d0151a08ca026a78bcc48160ef329185321065a8f84bb88e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpdtj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v2b4f\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.827149 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vmbxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99e0ced4-d228-4bfa-a263-b8934f0d8e5d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ba2dd80ae8843f4b47955ec634603e4a462a2f3480f75ce8a6fcbcb628dd75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T17:18:50Z\\\",\\\"message\\\":\\\"2025-09-30T17:18:04+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a\\\\n2025-09-30T17:18:04+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b66f689b-2731-4ddc-99ed-3624a395521a to /host/opt/cni/bin/\\\\n2025-09-30T17:18:04Z [verbose] multus-daemon started\\\\n2025-09-30T17:18:04Z [verbose] Readiness Indicator file check\\\\n2025-09-30T17:18:49Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqcg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vmbxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.845304 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kcwn2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa3bcb63-ebc5-4490-af95-b1325e664f48\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2b3eb9719eca97cba409dd59bcc6c3fd64461a6562f8055b61507c51c6d48282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x9vjp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:08Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kcwn2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.857258 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.857543 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.857609 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.857727 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.857796 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:21Z","lastTransitionTime":"2025-09-30T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.862244 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"faf8b25c-5e3f-4eee-8c3f-b384e2dafa92\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74153b9627fdcde579494b50b234cce3ef28a1a06c2330e838718c8f1600e59b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e6698ba65f05f0f8b00e815c47fd8f314a8b824b6f51dbde417ba00e3b81778\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fvbvh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:15Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hs6lf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.873395 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-l88vm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8b0c73d9-9a75-4e65-9220-904133af63fd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-88lz4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:16Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-l88vm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-30T17:19:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.886143 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4bdb424-5ee8-4b98-bf00-e42c13f0eebe\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6cce4b7a820073251d209e35cc2595c5d766a0d77d399ece693243088624c282\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e2727c5209e9ea11c33f67caf07f39964942e1fd413cc2c88082ecd55a3604e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e2727c5209e9ea11c33f67caf07f39964942e1fd413cc2c88082ecd55a3604e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:21Z is after 2025-08-24T17:21:41Z" 
Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.899384 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"66b4b432-1e27-4651-8833-d899bc384dd1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://caf0e976c966dec145ecd3eb79013b835ce2a3a942935cafd7e38d8e87b182aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f76c10db4f564ed99cb9f9d92fb32b9276ac7185ab96c1f2cdc2c1123ea44b6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0346e20c576a248b7661996c6dcbc130d6b235e6bf30989e2df843d917fd687c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"1
92.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aed62e52700bbc915a2ec550561f2abad948de4349eeb2946b9fe3f5b2c8dead\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.914774 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.927584 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58afc3c09e4ca0227598eee6b6778d6c789522fabe062801b5797d85a038b836\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e912233917c20707ec3a8bbd434eda12fd8313daf27a6afeab604e8809e0e630\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.941984 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.956970 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac448347-b650-429e-9e31-f8f9b7565f6e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d181ed119dbc7ce676896522e92401c87456e4b7b3978c37fc19ecfae252393d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wv5jd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:18:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-f5fmb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.960887 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.960928 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.960938 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.960953 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.960963 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:21Z","lastTransitionTime":"2025-09-30T17:19:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.978129 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4001a1b-8481-482e-8498-f53d1573ad18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b4e505220a1611a1d2e8eab9a8134a4789b197171041f43db2ed26cb92b32b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5088facc5d7495ddb93c7d746a776af9ca1eeee96b12a565f1569d835dfdf698\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45ebace74fcc34f12945b8446225348f04c2f99279d51813084d189ef5b986be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e28123771f205d1b4583831d54e451ed222c0ea333f7c7e9e89e10b773829c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://59d021c1b37367f6612cb6187dd11361e2f59c5eb51d7d72993f33424d078fcb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://609f2d01ea5f67b02e001e90e6eeff503d5317f79f20202cd33dd96ac4704343\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0861bfdc2b25df1a8cb176507cde1547bd3002f1bbafbf686a3e74442f1e123\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2925c6e1d697035058469d66cdb420f20dba21befc423b9507101433f8fefe1d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:21 crc kubenswrapper[4778]: I0930 17:19:21.995383 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3bb16e52-d2c2-4f1b-a783-a6dde6a9b8c1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://627b2e19f23cc29d7bc96b27ac79b674982ad33b740968a20fcc510769744e48\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f36130c428840349f026f4940239548deadb2a06b753f6c32af1180e87921865\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70969f4cf6e65e5815943a825fccc4fc167d98f86cbaf23e467d0821ff6d4cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://07e4866de4904bdc242860308eff6a6834272956035d63c403b422e395d90a94\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a20ec354d5304cce3a6d7bac88c564d22fdad7b1b11e5af567e7270fe3b0607f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T17:18:01Z\\\",\\\"message\\\":\\\"le observer\\\\nW0930 17:18:01.481239 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0930 17:18:01.481423 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 17:18:01.482552 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3020905986/tls.crt::/tmp/serving-cert-3020905986/tls.key\\\\\\\"\\\\nI0930 17:18:01.770985 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0930 17:18:01.775525 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0930 17:18:01.775553 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0930 17:18:01.775578 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0930 17:18:01.775583 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0930 17:18:01.791072 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0930 17:18:01.791128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791140 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0930 17:18:01.791150 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0930 17:18:01.791159 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0930 17:18:01.791166 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0930 17:18:01.791173 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0930 17:18:01.791418 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0930 17:18:01.794086 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:56Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6eef912d10cb16a0e4b9cde45f41b43e3f6673e58fb00c3bb52d4087648d8d80\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:44Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0954e1ca579e53873f484d781dacca9ccc4845be7a8e424c991114bb4e087c0a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:21Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:22 crc kubenswrapper[4778]: I0930 17:19:22.011654 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a21377c-3b21-4362-a4f7-0965f1857b50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T17:17:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ce2f1710fd95f4c2ed5d42d0bacbf201f33504b6ca700d4bd46242d831650719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5aefa2a593ab97c6e589754e59c133187bcb0e79a03506006467f2605ce50362\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c0386bb49c3d9173d75b49c20feba3144ef67a71169719c11e1d9a5a9c5eca6d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0668df8cb60366f0640eb31187e35daea5290901793c2b51eb8c3bb898d8badf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:17:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T17:17:41Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T17:19:22Z is after 2025-08-24T17:21:41Z" Sep 30 17:19:22 crc kubenswrapper[4778]: I0930 17:19:22.025235 4778 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T17:18:04Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7450aff53e918fdb82e3d2a212f47c65a1ae0adfb0321b5113e53f2c82a3429e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T17:18:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-09-30T17:19:22Z is after 2025-08-24T17:21:41Z"
Sep 30 17:19:22 crc kubenswrapper[4778]: I0930 17:19:22.063216 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 17:19:22 crc kubenswrapper[4778]: I0930 17:19:22.063271 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 17:19:22 crc kubenswrapper[4778]: I0930 17:19:22.063281 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 17:19:22 crc kubenswrapper[4778]: I0930 17:19:22.063303 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 17:19:22 crc kubenswrapper[4778]: I0930 17:19:22.063316 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:22Z","lastTransitionTime":"2025-09-30T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
[... the five-record NodeHasSufficientMemory / NodeHasNoDiskPressure / NodeHasSufficientPID / NodeNotReady / "Node became not ready" sequence above repeats with only the timestamps advancing, roughly every 100 ms, from 17:19:22.166 through 17:19:26.506; the repeated blocks are omitted here and the interleaved pod records are retained ...]
Sep 30 17:19:22 crc kubenswrapper[4778]: I0930 17:19:22.713643 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm"
Sep 30 17:19:22 crc kubenswrapper[4778]: I0930 17:19:22.713708 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:19:22 crc kubenswrapper[4778]: E0930 17:19:22.713884 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd"
Sep 30 17:19:22 crc kubenswrapper[4778]: E0930 17:19:22.714034 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Has your network provider started?"} Sep 30 17:19:22 crc kubenswrapper[4778]: I0930 17:19:22.994897 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:22 crc kubenswrapper[4778]: I0930 17:19:22.994940 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:22 crc kubenswrapper[4778]: I0930 17:19:22.994951 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:22 crc kubenswrapper[4778]: I0930 17:19:22.994973 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:22 crc kubenswrapper[4778]: I0930 17:19:22.995023 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:22Z","lastTransitionTime":"2025-09-30T17:19:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.097555 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.097634 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.097645 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.097670 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.097684 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:23Z","lastTransitionTime":"2025-09-30T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.200286 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.200351 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.200363 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.200381 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.200395 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:23Z","lastTransitionTime":"2025-09-30T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.303729 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.303785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.303798 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.303820 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.303834 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:23Z","lastTransitionTime":"2025-09-30T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.406907 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.406961 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.406971 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.406994 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.407005 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:23Z","lastTransitionTime":"2025-09-30T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.509660 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.509721 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.509734 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.509760 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.509773 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:23Z","lastTransitionTime":"2025-09-30T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.613141 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.613200 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.613211 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.613234 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.613248 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:23Z","lastTransitionTime":"2025-09-30T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.713697 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.714087 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:23 crc kubenswrapper[4778]: E0930 17:19:23.714251 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:23 crc kubenswrapper[4778]: E0930 17:19:23.714542 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.715644 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.715673 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.715732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.715762 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.715775 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:23Z","lastTransitionTime":"2025-09-30T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.819874 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.819911 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.819921 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.819939 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.819952 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:23Z","lastTransitionTime":"2025-09-30T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.923187 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.923251 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.923273 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.923297 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:23 crc kubenswrapper[4778]: I0930 17:19:23.923311 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:23Z","lastTransitionTime":"2025-09-30T17:19:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.026224 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.026285 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.026298 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.026322 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.026342 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:24Z","lastTransitionTime":"2025-09-30T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.129554 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.129644 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.129662 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.129685 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.129701 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:24Z","lastTransitionTime":"2025-09-30T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.232970 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.233104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.233121 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.233144 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.233160 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:24Z","lastTransitionTime":"2025-09-30T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.335926 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.336028 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.336056 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.336091 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.336116 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:24Z","lastTransitionTime":"2025-09-30T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.439482 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.439593 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.439612 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.439695 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.439717 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:24Z","lastTransitionTime":"2025-09-30T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.542934 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.542992 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.543005 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.543028 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.543043 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:24Z","lastTransitionTime":"2025-09-30T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.645867 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.645915 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.645928 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.645944 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.645956 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:24Z","lastTransitionTime":"2025-09-30T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.713222 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.713222 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:24 crc kubenswrapper[4778]: E0930 17:19:24.713400 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:24 crc kubenswrapper[4778]: E0930 17:19:24.713458 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.748867 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.748914 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.748928 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.748948 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.748961 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:24Z","lastTransitionTime":"2025-09-30T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.852240 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.852305 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.852317 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.852342 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.852356 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:24Z","lastTransitionTime":"2025-09-30T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.954978 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.955100 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.955123 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.955153 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:24 crc kubenswrapper[4778]: I0930 17:19:24.955171 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:24Z","lastTransitionTime":"2025-09-30T17:19:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.059083 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.059161 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.059183 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.059212 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.059237 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:25Z","lastTransitionTime":"2025-09-30T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.163309 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.163450 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.163464 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.163486 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.163499 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:25Z","lastTransitionTime":"2025-09-30T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.267015 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.267067 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.267076 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.267098 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.267115 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:25Z","lastTransitionTime":"2025-09-30T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.370564 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.370636 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.370650 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.370670 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.370680 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:25Z","lastTransitionTime":"2025-09-30T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.473779 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.473844 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.473866 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.473890 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.473906 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:25Z","lastTransitionTime":"2025-09-30T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.576987 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.577049 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.577061 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.577088 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.577102 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:25Z","lastTransitionTime":"2025-09-30T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.680713 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.680756 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.680768 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.680785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.680796 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:25Z","lastTransitionTime":"2025-09-30T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.713709 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.713763 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:25 crc kubenswrapper[4778]: E0930 17:19:25.713905 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:25 crc kubenswrapper[4778]: E0930 17:19:25.714083 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.783939 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.784052 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.784072 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.784097 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.784117 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:25Z","lastTransitionTime":"2025-09-30T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.886712 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.886817 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.886836 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.886896 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.886915 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:25Z","lastTransitionTime":"2025-09-30T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.989878 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.989933 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.989944 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.989965 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:25 crc kubenswrapper[4778]: I0930 17:19:25.989978 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:25Z","lastTransitionTime":"2025-09-30T17:19:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.093689 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.093736 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.093749 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.093767 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.093781 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:26Z","lastTransitionTime":"2025-09-30T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.197157 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.197209 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.197220 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.197239 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.197251 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:26Z","lastTransitionTime":"2025-09-30T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.299980 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.300071 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.300090 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.300124 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.300143 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:26Z","lastTransitionTime":"2025-09-30T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.403128 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.403178 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.403189 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.403208 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.403220 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:26Z","lastTransitionTime":"2025-09-30T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.506699 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.506785 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.506804 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.506830 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.506845 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:26Z","lastTransitionTime":"2025-09-30T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.610418 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.610489 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.610531 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.610570 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.610596 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:26Z","lastTransitionTime":"2025-09-30T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.713655 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.713787 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:26 crc kubenswrapper[4778]: E0930 17:19:26.713922 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.714012 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:26 crc kubenswrapper[4778]: E0930 17:19:26.714032 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.714055 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.714304 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.714370 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.714454 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:26Z","lastTransitionTime":"2025-09-30T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.818651 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.818714 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.818730 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.818752 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.818768 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:26Z","lastTransitionTime":"2025-09-30T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.923720 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.923827 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.923846 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.923903 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:26 crc kubenswrapper[4778]: I0930 17:19:26.923928 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:26Z","lastTransitionTime":"2025-09-30T17:19:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.028243 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.028340 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.028362 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.028390 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.028439 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:27Z","lastTransitionTime":"2025-09-30T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.132359 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.132428 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.132446 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.132474 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.132495 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:27Z","lastTransitionTime":"2025-09-30T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.235677 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.235732 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.235750 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.235778 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.235796 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:27Z","lastTransitionTime":"2025-09-30T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.339091 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.339200 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.339213 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.339232 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.339245 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:27Z","lastTransitionTime":"2025-09-30T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.442173 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.442237 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.442257 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.442284 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.442304 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:27Z","lastTransitionTime":"2025-09-30T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.546932 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.547025 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.547060 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.547095 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.547120 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:27Z","lastTransitionTime":"2025-09-30T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.650338 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.650411 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.650422 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.650455 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.650465 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:27Z","lastTransitionTime":"2025-09-30T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.713800 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.713745 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:27 crc kubenswrapper[4778]: E0930 17:19:27.714021 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:27 crc kubenswrapper[4778]: E0930 17:19:27.714248 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.754515 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.754563 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.754572 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.754595 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.754610 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:27Z","lastTransitionTime":"2025-09-30T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.857019 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.857070 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.857082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.857102 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.857296 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:27Z","lastTransitionTime":"2025-09-30T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.959927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.959989 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.960002 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.960026 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:27 crc kubenswrapper[4778]: I0930 17:19:27.960040 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:27Z","lastTransitionTime":"2025-09-30T17:19:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.063125 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.063185 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.063196 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.063218 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.063233 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:28Z","lastTransitionTime":"2025-09-30T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.171988 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.172874 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.172912 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.172936 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.172951 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:28Z","lastTransitionTime":"2025-09-30T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.276248 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.276295 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.276307 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.276325 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.276336 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:28Z","lastTransitionTime":"2025-09-30T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.379842 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.379896 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.379908 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.379927 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.379939 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:28Z","lastTransitionTime":"2025-09-30T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.483107 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.483198 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.483217 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.483249 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.483272 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:28Z","lastTransitionTime":"2025-09-30T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.586744 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.586817 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.586837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.586871 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.586893 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:28Z","lastTransitionTime":"2025-09-30T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.690272 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.690340 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.690365 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.690393 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.690413 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:28Z","lastTransitionTime":"2025-09-30T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.713299 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.713431 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:28 crc kubenswrapper[4778]: E0930 17:19:28.713708 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:28 crc kubenswrapper[4778]: E0930 17:19:28.713982 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.793098 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.793178 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.793194 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.793216 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.793227 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:28Z","lastTransitionTime":"2025-09-30T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.896778 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.896824 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.896837 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.896855 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:28 crc kubenswrapper[4778]: I0930 17:19:28.896869 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:28Z","lastTransitionTime":"2025-09-30T17:19:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.000179 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.000220 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.000228 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.000243 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.000253 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:29Z","lastTransitionTime":"2025-09-30T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.102581 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.102637 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.102650 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.102668 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.102681 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:29Z","lastTransitionTime":"2025-09-30T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.206203 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.206270 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.206287 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.206312 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.206329 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:29Z","lastTransitionTime":"2025-09-30T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.309043 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.309082 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.309090 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.309104 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.309117 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:29Z","lastTransitionTime":"2025-09-30T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.412439 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.412516 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.412534 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.412563 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.412582 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:29Z","lastTransitionTime":"2025-09-30T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.516511 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.516580 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.516599 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.516661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.516683 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:29Z","lastTransitionTime":"2025-09-30T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.620572 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.620684 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.620705 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.620735 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.620752 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:29Z","lastTransitionTime":"2025-09-30T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.713332 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.713349 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:29 crc kubenswrapper[4778]: E0930 17:19:29.713656 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:29 crc kubenswrapper[4778]: E0930 17:19:29.713879 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.723124 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.723208 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.723231 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.723257 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.723277 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:29Z","lastTransitionTime":"2025-09-30T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.826998 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.827063 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.827081 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.827123 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.827143 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:29Z","lastTransitionTime":"2025-09-30T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.924934 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.925006 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.925024 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.925060 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.925084 4778 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T17:19:29Z","lastTransitionTime":"2025-09-30T17:19:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.995101 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r"] Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.995572 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.998034 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Sep 30 17:19:29 crc kubenswrapper[4778]: I0930 17:19:29.999796 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:29.999976 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:29.999979 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.018891 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podStartSLOduration=88.018858641 podStartE2EDuration="1m28.018858641s" podCreationTimestamp="2025-09-30 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:19:30.018679036 +0000 UTC m=+109.008576849" watchObservedRunningTime="2025-09-30 17:19:30.018858641 +0000 UTC m=+109.008756474" Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.071318 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=87.071282409 podStartE2EDuration="1m27.071282409s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:19:30.052650185 +0000 UTC m=+109.042548008" watchObservedRunningTime="2025-09-30 17:19:30.071282409 +0000 UTC m=+109.061180252" Sep 30 17:19:30 crc 
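
[Editor's note] Everything from 17:19:25 through 17:19:29 above is one pattern repeating at the kubelet's ~100 ms node-status cadence: four "Recording event message" records, then a "Node became not ready" Ready=False condition whose reason is always the same missing-CNI message, plus a "No sandbox for pod can be found" / "Error syncing pod, skipping" burst roughly every two seconds for the four pods that need a CNI-provided sandbox (networking-console-plugin, network-check-target, network-check-source, network-metrics-daemon). The node is held NotReady for one reason only: no network config exists yet in /etc/kubernetes/cni/net.d. The sketch below is an illustrative stand-in for that check, not kubelet source; only the directory path is taken from the log, and the extension list is the set I believe libcni scans for.

    // cni_ready.go -- illustrative stand-in (not kubelet code) for the
    // readiness check driving the entries above: NetworkReady stays false
    // while no CNI network config can be loaded from the conf directory.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    const confDir = "/etc/kubernetes/cni/net.d" // path reported in the log

    func main() {
        entries, err := os.ReadDir(confDir)
        if err != nil {
            fmt.Printf("NetworkReady=false: cannot read %s: %v\n", confDir, err)
            return
        }
        for _, e := range entries {
            if e.IsDir() {
                continue
            }
            // Extensions assumed from libcni's config scan.
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                fmt.Printf("NetworkReady=true: found %s\n", e.Name())
                return
            }
        }
        fmt.Println("NetworkReady=false: no CNI configuration file in " + confDir)
    }

Once the network provider (presumably Multus/OVN-Kubernetes here, to judge by the affected pods) writes its config into that directory, the same check flips NetworkReady to true and the flapping stops; the SyncLoop ADD that follows is the first sign of the cluster settling.
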
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.071712 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=88.071704853 podStartE2EDuration="1m28.071704853s" podCreationTimestamp="2025-09-30 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:19:30.071065752 +0000 UTC m=+109.060963565" watchObservedRunningTime="2025-09-30 17:19:30.071704853 +0000 UTC m=+109.061602696"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.082406 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/676ef38e-f7bd-43ef-a18f-1d39797bd8e8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t8h5r\" (UID: \"676ef38e-f7bd-43ef-a18f-1d39797bd8e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.082475 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/676ef38e-f7bd-43ef-a18f-1d39797bd8e8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t8h5r\" (UID: \"676ef38e-f7bd-43ef-a18f-1d39797bd8e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.082502 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/676ef38e-f7bd-43ef-a18f-1d39797bd8e8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t8h5r\" (UID: \"676ef38e-f7bd-43ef-a18f-1d39797bd8e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.082530 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/676ef38e-f7bd-43ef-a18f-1d39797bd8e8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t8h5r\" (UID: \"676ef38e-f7bd-43ef-a18f-1d39797bd8e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.082678 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/676ef38e-f7bd-43ef-a18f-1d39797bd8e8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t8h5r\" (UID: \"676ef38e-f7bd-43ef-a18f-1d39797bd8e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.158982 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=87.158958868 podStartE2EDuration="1m27.158958868s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:19:30.146227792 +0000 UTC m=+109.136125605" watchObservedRunningTime="2025-09-30 17:19:30.158958868 +0000 UTC m=+109.148856671"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.183701 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/676ef38e-f7bd-43ef-a18f-1d39797bd8e8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t8h5r\" (UID: \"676ef38e-f7bd-43ef-a18f-1d39797bd8e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.183761 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/676ef38e-f7bd-43ef-a18f-1d39797bd8e8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t8h5r\" (UID: \"676ef38e-f7bd-43ef-a18f-1d39797bd8e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.183800 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/676ef38e-f7bd-43ef-a18f-1d39797bd8e8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t8h5r\" (UID: \"676ef38e-f7bd-43ef-a18f-1d39797bd8e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.183827 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/676ef38e-f7bd-43ef-a18f-1d39797bd8e8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t8h5r\" (UID: \"676ef38e-f7bd-43ef-a18f-1d39797bd8e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.183854 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/676ef38e-f7bd-43ef-a18f-1d39797bd8e8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t8h5r\" (UID: \"676ef38e-f7bd-43ef-a18f-1d39797bd8e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.183980 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/676ef38e-f7bd-43ef-a18f-1d39797bd8e8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-t8h5r\" (UID: \"676ef38e-f7bd-43ef-a18f-1d39797bd8e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.183979 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/676ef38e-f7bd-43ef-a18f-1d39797bd8e8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-t8h5r\" (UID: \"676ef38e-f7bd-43ef-a18f-1d39797bd8e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.184876 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/676ef38e-f7bd-43ef-a18f-1d39797bd8e8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-t8h5r\" (UID: \"676ef38e-f7bd-43ef-a18f-1d39797bd8e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.190274 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/676ef38e-f7bd-43ef-a18f-1d39797bd8e8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-t8h5r\" (UID: \"676ef38e-f7bd-43ef-a18f-1d39797bd8e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.207245 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/676ef38e-f7bd-43ef-a18f-1d39797bd8e8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-t8h5r\" (UID: \"676ef38e-f7bd-43ef-a18f-1d39797bd8e8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.213203 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cwrn6" podStartSLOduration=88.213172955 podStartE2EDuration="1m28.213172955s" podCreationTimestamp="2025-09-30 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:19:30.212899196 +0000 UTC m=+109.202797049" watchObservedRunningTime="2025-09-30 17:19:30.213172955 +0000 UTC m=+109.203070778"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.260513 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hs6lf" podStartSLOduration=87.260491892 podStartE2EDuration="1m27.260491892s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:19:30.259422586 +0000 UTC m=+109.249320389" watchObservedRunningTime="2025-09-30 17:19:30.260491892 +0000 UTC m=+109.250389695"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.288027 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=25.287997924 podStartE2EDuration="25.287997924s" podCreationTimestamp="2025-09-30 17:19:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:19:30.28727802 +0000 UTC m=+109.277175833" watchObservedRunningTime="2025-09-30 17:19:30.287997924 +0000 UTC m=+109.277895737"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.301737 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=53.301714684 podStartE2EDuration="53.301714684s" podCreationTimestamp="2025-09-30 17:18:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:19:30.300770252 +0000 UTC m=+109.290668055" watchObservedRunningTime="2025-09-30 17:19:30.301714684 +0000 UTC m=+109.291612497"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.313642 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-v2b4f" podStartSLOduration=88.313592092 podStartE2EDuration="1m28.313592092s" podCreationTimestamp="2025-09-30 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:19:30.312513426 +0000 UTC m=+109.302411229" watchObservedRunningTime="2025-09-30 17:19:30.313592092 +0000 UTC m=+109.303489905"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.316110 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r"
Sep 30 17:19:30 crc kubenswrapper[4778]: W0930 17:19:30.333971 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod676ef38e_f7bd_43ef_a18f_1d39797bd8e8.slice/crio-45b793fc9aa39e9c22796cfb80ad393b2cc3e41be9da68d128e9b1f117b2360c WatchSource:0}: Error finding container 45b793fc9aa39e9c22796cfb80ad393b2cc3e41be9da68d128e9b1f117b2360c: Status 404 returned error can't find the container with id 45b793fc9aa39e9c22796cfb80ad393b2cc3e41be9da68d128e9b1f117b2360c
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.343034 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vmbxd" podStartSLOduration=88.343009838 podStartE2EDuration="1m28.343009838s" podCreationTimestamp="2025-09-30 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:19:30.342995768 +0000 UTC m=+109.332893581" watchObservedRunningTime="2025-09-30 17:19:30.343009838 +0000 UTC m=+109.332907641"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.359199 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kcwn2" podStartSLOduration=87.35917242 podStartE2EDuration="1m27.35917242s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:19:30.357415541 +0000 UTC m=+109.347313384" watchObservedRunningTime="2025-09-30 17:19:30.35917242 +0000 UTC m=+109.349070223"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.363950 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r" event={"ID":"676ef38e-f7bd-43ef-a18f-1d39797bd8e8","Type":"ContainerStarted","Data":"45b793fc9aa39e9c22796cfb80ad393b2cc3e41be9da68d128e9b1f117b2360c"}
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.713389 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:19:30 crc kubenswrapper[4778]: I0930 17:19:30.713515 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm"
Sep 30 17:19:30 crc kubenswrapper[4778]: E0930 17:19:30.713548 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 17:19:30 crc kubenswrapper[4778]: E0930 17:19:30.713767 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd"
Sep 30 17:19:31 crc kubenswrapper[4778]: I0930 17:19:31.371481 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r" event={"ID":"676ef38e-f7bd-43ef-a18f-1d39797bd8e8","Type":"ContainerStarted","Data":"bad8962dcb0e70cf54417f5218b95d2e13d188941d6da615c8e166c888fb150d"}
Sep 30 17:19:31 crc kubenswrapper[4778]: I0930 17:19:31.392399 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-t8h5r" podStartSLOduration=89.392368864 podStartE2EDuration="1m29.392368864s" podCreationTimestamp="2025-09-30 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:19:31.390767721 +0000 UTC m=+110.380665564" watchObservedRunningTime="2025-09-30 17:19:31.392368864 +0000 UTC m=+110.382266707"
Sep 30 17:19:31 crc kubenswrapper[4778]: I0930 17:19:31.714113 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:19:31 crc kubenswrapper[4778]: I0930 17:19:31.714170 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:19:31 crc kubenswrapper[4778]: E0930 17:19:31.714888 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 17:19:31 crc kubenswrapper[4778]: E0930 17:19:31.715104 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 17:19:31 crc kubenswrapper[4778]: I0930 17:19:31.716100 4778 scope.go:117] "RemoveContainer" containerID="2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca"
Sep 30 17:19:31 crc kubenswrapper[4778]: E0930 17:19:31.716301 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kzlfx_openshift-ovn-kubernetes(deb38969-9012-468f-87aa-2e70a5f8f3c4)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4"
Sep 30 17:19:32 crc kubenswrapper[4778]: I0930 17:19:32.713207 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:19:32 crc kubenswrapper[4778]: I0930 17:19:32.713266 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm"
Sep 30 17:19:32 crc kubenswrapper[4778]: E0930 17:19:32.713701 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 17:19:32 crc kubenswrapper[4778]: E0930 17:19:32.713853 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd"
Sep 30 17:19:33 crc kubenswrapper[4778]: I0930 17:19:33.713845 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:19:33 crc kubenswrapper[4778]: I0930 17:19:33.713861 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:19:33 crc kubenswrapper[4778]: E0930 17:19:33.714090 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 17:19:33 crc kubenswrapper[4778]: E0930 17:19:33.714399 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 17:19:34 crc kubenswrapper[4778]: I0930 17:19:34.713178 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm"
Sep 30 17:19:34 crc kubenswrapper[4778]: I0930 17:19:34.713224 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:19:34 crc kubenswrapper[4778]: E0930 17:19:34.713347 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd"
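[Annotation] Every "network is not ready" line above reduces to one condition: the container runtime finds no CNI network configuration under /etc/kubernetes/cni/net.d/. On this cluster that file is written by ovn-kubernetes once ovnkube-node comes up, and ovnkube-controller is crash-looping (see the back-off entries), so the condition persists. A minimal Go reproduction of the directory check (the real gate lives in the container runtime, CRI-O here, not in this sketch; the extensions are the conventional CNI ones):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cniConfigPresent reports whether dir holds any CNI network config file.
func cniConfigPresent(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := cniConfigPresent("/etc/kubernetes/cni/net.d")
	fmt.Printf("CNI config present: %v (err: %v)\n", ok, err)
}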
pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:34 crc kubenswrapper[4778]: E0930 17:19:34.713448 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:35 crc kubenswrapper[4778]: I0930 17:19:35.713734 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:35 crc kubenswrapper[4778]: E0930 17:19:35.713886 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:35 crc kubenswrapper[4778]: I0930 17:19:35.713958 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:35 crc kubenswrapper[4778]: E0930 17:19:35.714181 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:36 crc kubenswrapper[4778]: I0930 17:19:36.388072 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vmbxd_99e0ced4-d228-4bfa-a263-b8934f0d8e5d/kube-multus/1.log" Sep 30 17:19:36 crc kubenswrapper[4778]: I0930 17:19:36.388520 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vmbxd_99e0ced4-d228-4bfa-a263-b8934f0d8e5d/kube-multus/0.log" Sep 30 17:19:36 crc kubenswrapper[4778]: I0930 17:19:36.388560 4778 generic.go:334] "Generic (PLEG): container finished" podID="99e0ced4-d228-4bfa-a263-b8934f0d8e5d" containerID="b2ba2dd80ae8843f4b47955ec634603e4a462a2f3480f75ce8a6fcbcb628dd75" exitCode=1 Sep 30 17:19:36 crc kubenswrapper[4778]: I0930 17:19:36.388611 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vmbxd" event={"ID":"99e0ced4-d228-4bfa-a263-b8934f0d8e5d","Type":"ContainerDied","Data":"b2ba2dd80ae8843f4b47955ec634603e4a462a2f3480f75ce8a6fcbcb628dd75"} Sep 30 17:19:36 crc kubenswrapper[4778]: I0930 17:19:36.388689 4778 scope.go:117] "RemoveContainer" containerID="7abecf439cc8da8e56c68bb5af8af0477467584d20afb1c708d4ce84d959c233" Sep 30 17:19:36 crc kubenswrapper[4778]: I0930 17:19:36.389140 4778 scope.go:117] "RemoveContainer" containerID="b2ba2dd80ae8843f4b47955ec634603e4a462a2f3480f75ce8a6fcbcb628dd75" Sep 30 17:19:36 crc kubenswrapper[4778]: E0930 17:19:36.389312 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-vmbxd_openshift-multus(99e0ced4-d228-4bfa-a263-b8934f0d8e5d)\"" pod="openshift-multus/multus-vmbxd" podUID="99e0ced4-d228-4bfa-a263-b8934f0d8e5d" Sep 30 17:19:36 crc kubenswrapper[4778]: I0930 17:19:36.713219 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:36 crc kubenswrapper[4778]: I0930 17:19:36.713403 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:36 crc kubenswrapper[4778]: E0930 17:19:36.713558 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:36 crc kubenswrapper[4778]: E0930 17:19:36.713758 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:37 crc kubenswrapper[4778]: I0930 17:19:37.393662 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vmbxd_99e0ced4-d228-4bfa-a263-b8934f0d8e5d/kube-multus/1.log" Sep 30 17:19:37 crc kubenswrapper[4778]: I0930 17:19:37.713372 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:37 crc kubenswrapper[4778]: I0930 17:19:37.713450 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:37 crc kubenswrapper[4778]: E0930 17:19:37.713757 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:37 crc kubenswrapper[4778]: E0930 17:19:37.713897 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:38 crc kubenswrapper[4778]: I0930 17:19:38.713329 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:38 crc kubenswrapper[4778]: I0930 17:19:38.713338 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:38 crc kubenswrapper[4778]: E0930 17:19:38.713578 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:38 crc kubenswrapper[4778]: E0930 17:19:38.713792 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:39 crc kubenswrapper[4778]: I0930 17:19:39.713590 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:39 crc kubenswrapper[4778]: I0930 17:19:39.713793 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:39 crc kubenswrapper[4778]: E0930 17:19:39.713973 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:39 crc kubenswrapper[4778]: E0930 17:19:39.714112 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:40 crc kubenswrapper[4778]: I0930 17:19:40.713570 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:40 crc kubenswrapper[4778]: E0930 17:19:40.713714 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:40 crc kubenswrapper[4778]: I0930 17:19:40.714066 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:40 crc kubenswrapper[4778]: E0930 17:19:40.714350 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:41 crc kubenswrapper[4778]: I0930 17:19:41.713344 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:41 crc kubenswrapper[4778]: I0930 17:19:41.713365 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:41 crc kubenswrapper[4778]: E0930 17:19:41.716258 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:41 crc kubenswrapper[4778]: E0930 17:19:41.716400 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:41 crc kubenswrapper[4778]: E0930 17:19:41.746348 4778 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Sep 30 17:19:41 crc kubenswrapper[4778]: E0930 17:19:41.832523 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 17:19:42 crc kubenswrapper[4778]: I0930 17:19:42.713111 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:42 crc kubenswrapper[4778]: I0930 17:19:42.713219 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:42 crc kubenswrapper[4778]: E0930 17:19:42.713443 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:42 crc kubenswrapper[4778]: E0930 17:19:42.713581 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:43 crc kubenswrapper[4778]: I0930 17:19:43.713173 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:43 crc kubenswrapper[4778]: E0930 17:19:43.713481 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:43 crc kubenswrapper[4778]: I0930 17:19:43.713986 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:43 crc kubenswrapper[4778]: E0930 17:19:43.714169 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:44 crc kubenswrapper[4778]: I0930 17:19:44.713892 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:44 crc kubenswrapper[4778]: I0930 17:19:44.714070 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:44 crc kubenswrapper[4778]: E0930 17:19:44.714104 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:44 crc kubenswrapper[4778]: E0930 17:19:44.714299 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:44 crc kubenswrapper[4778]: I0930 17:19:44.715536 4778 scope.go:117] "RemoveContainer" containerID="2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca" Sep 30 17:19:45 crc kubenswrapper[4778]: I0930 17:19:45.431124 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovnkube-controller/3.log" Sep 30 17:19:45 crc kubenswrapper[4778]: I0930 17:19:45.434003 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerStarted","Data":"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b"} Sep 30 17:19:45 crc kubenswrapper[4778]: I0930 17:19:45.434520 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:19:45 crc kubenswrapper[4778]: I0930 17:19:45.468924 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podStartSLOduration=103.468901262 podStartE2EDuration="1m43.468901262s" podCreationTimestamp="2025-09-30 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:19:45.467388871 +0000 UTC m=+124.457286674" watchObservedRunningTime="2025-09-30 17:19:45.468901262 +0000 UTC m=+124.458799075" Sep 30 17:19:45 crc kubenswrapper[4778]: I0930 17:19:45.712985 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:45 crc kubenswrapper[4778]: I0930 17:19:45.713095 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:45 crc kubenswrapper[4778]: E0930 17:19:45.713220 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:45 crc kubenswrapper[4778]: E0930 17:19:45.713296 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:45 crc kubenswrapper[4778]: I0930 17:19:45.912273 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l88vm"] Sep 30 17:19:45 crc kubenswrapper[4778]: I0930 17:19:45.912515 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:45 crc kubenswrapper[4778]: E0930 17:19:45.912779 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:46 crc kubenswrapper[4778]: I0930 17:19:46.713433 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:46 crc kubenswrapper[4778]: E0930 17:19:46.714256 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:46 crc kubenswrapper[4778]: E0930 17:19:46.834029 4778 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 17:19:47 crc kubenswrapper[4778]: I0930 17:19:47.713567 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:47 crc kubenswrapper[4778]: I0930 17:19:47.713681 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:47 crc kubenswrapper[4778]: I0930 17:19:47.713791 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:47 crc kubenswrapper[4778]: E0930 17:19:47.713783 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:47 crc kubenswrapper[4778]: E0930 17:19:47.714017 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:47 crc kubenswrapper[4778]: E0930 17:19:47.714112 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:48 crc kubenswrapper[4778]: I0930 17:19:48.714308 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:48 crc kubenswrapper[4778]: E0930 17:19:48.714550 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:48 crc kubenswrapper[4778]: I0930 17:19:48.715273 4778 scope.go:117] "RemoveContainer" containerID="b2ba2dd80ae8843f4b47955ec634603e4a462a2f3480f75ce8a6fcbcb628dd75" Sep 30 17:19:49 crc kubenswrapper[4778]: I0930 17:19:49.451079 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vmbxd_99e0ced4-d228-4bfa-a263-b8934f0d8e5d/kube-multus/1.log" Sep 30 17:19:49 crc kubenswrapper[4778]: I0930 17:19:49.451524 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vmbxd" event={"ID":"99e0ced4-d228-4bfa-a263-b8934f0d8e5d","Type":"ContainerStarted","Data":"5555f4f469382c30ef23a0546878aecb39a8bf359d2403619aa573b16c9c3a19"} Sep 30 17:19:49 crc kubenswrapper[4778]: I0930 17:19:49.714081 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:49 crc kubenswrapper[4778]: I0930 17:19:49.714147 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:49 crc kubenswrapper[4778]: I0930 17:19:49.714200 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:49 crc kubenswrapper[4778]: E0930 17:19:49.714312 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:49 crc kubenswrapper[4778]: E0930 17:19:49.714476 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:49 crc kubenswrapper[4778]: E0930 17:19:49.714644 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:50 crc kubenswrapper[4778]: I0930 17:19:50.713672 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:50 crc kubenswrapper[4778]: E0930 17:19:50.713872 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 17:19:51 crc kubenswrapper[4778]: I0930 17:19:51.713748 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:51 crc kubenswrapper[4778]: I0930 17:19:51.713891 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:51 crc kubenswrapper[4778]: E0930 17:19:51.716698 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 17:19:51 crc kubenswrapper[4778]: I0930 17:19:51.716747 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:51 crc kubenswrapper[4778]: E0930 17:19:51.716859 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 17:19:51 crc kubenswrapper[4778]: E0930 17:19:51.717089 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l88vm" podUID="8b0c73d9-9a75-4e65-9220-904133af63fd" Sep 30 17:19:52 crc kubenswrapper[4778]: I0930 17:19:52.713734 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 17:19:52 crc kubenswrapper[4778]: I0930 17:19:52.716359 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 30 17:19:52 crc kubenswrapper[4778]: I0930 17:19:52.717994 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 30 17:19:53 crc kubenswrapper[4778]: I0930 17:19:53.713229 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 17:19:53 crc kubenswrapper[4778]: I0930 17:19:53.713276 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:19:53 crc kubenswrapper[4778]: I0930 17:19:53.713276 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:19:53 crc kubenswrapper[4778]: I0930 17:19:53.717211 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 30 17:19:53 crc kubenswrapper[4778]: I0930 17:19:53.717394 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 30 17:19:53 crc kubenswrapper[4778]: I0930 17:19:53.717644 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Sep 30 17:19:53 crc kubenswrapper[4778]: I0930 17:19:53.717655 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.840661 4778 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.878254 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89"] Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.878992 4778 util.go:30] "No sandbox for pod can be found. 
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.879022 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zwnt5"]
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.879756 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv"]
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.879899 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-zwnt5"
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.880206 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv"
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.880647 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-pl4fp"]
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.880902 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pl4fp"
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.881460 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wxgcd"]
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.881773 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd"
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.885315 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hm822"]
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.885894 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hm822"
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.887322 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.887600 4778 reflector.go:561] object-"openshift-controller-manager"/"config": failed to list *v1.ConfigMap: configmaps "config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Sep 30 17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.887665 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.887703 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.887782 4778 reflector.go:561] object-"openshift-controller-manager"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Sep 30 17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.887800 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.887849 4778 reflector.go:561] object-"openshift-controller-manager"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Sep 30 17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.887861 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.888024 4778 reflector.go:561] object-"openshift-controller-manager"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Sep 30 17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.888068 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.888544 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.888679 4778 reflector.go:561] object-"openshift-controller-manager"/"client-ca": failed to list *v1.ConfigMap: configmaps "client-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.888753 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Sep 30 17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.888742 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"client-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"client-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.889760 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nprqp"]
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.890207 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp"
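[Annotation] The forbidden list/watch warnings here and below are the node authorizer at work rather than a misconfiguration: credentials for system:node:crc may only read secrets and configmaps the authorizer can trace to pods bound to this node, and the pods referencing these objects were ADDed only milliseconds earlier, so the authorizer's graph has not caught up yet ("no relationship found between node 'crc' and this object"); the reflectors retry until it does. A client-go sketch of detecting that state (the kubeconfig path is a placeholder for the node's credentials, and the namespace/object names are taken from the warnings above):

package main

import (
	"context"
	"fmt"

	apierrors "k8s.io/apimachinery/pkg/api/errors"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig") // node credentials (placeholder path)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	_, err = cs.CoreV1().ConfigMaps("openshift-controller-manager").Get(context.TODO(), "client-ca", metav1.GetOptions{})
	if apierrors.IsForbidden(err) {
		// Same condition the reflector warnings report: the node authorizer
		// has no pod-to-object edge for this node yet.
		fmt.Println("forbidden:", err)
	}
}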
Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.892711 4778 reflector.go:561] object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c": failed to list *v1.Secret: secrets "openshift-controller-manager-sa-dockercfg-msq4c" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-controller-manager": no relationship found between node 'crc' and this object
Sep 30 17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.892762 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-controller-manager\"/\"openshift-controller-manager-sa-dockercfg-msq4c\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"openshift-controller-manager-sa-dockercfg-msq4c\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-controller-manager\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.892891 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.893134 4778 reflector.go:561] object-"openshift-console"/"console-dockercfg-f62pw": failed to list *v1.Secret: secrets "console-dockercfg-f62pw" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object
Sep 30 17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.893160 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"console-dockercfg-f62pw\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"console-dockercfg-f62pw\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.893216 4778 reflector.go:561] object-"openshift-console"/"console-config": failed to list *v1.ConfigMap: configmaps "console-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object
Sep 30 17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.893231 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"console-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"console-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.893289 4778 reflector.go:561] object-"openshift-console"/"console-oauth-config": failed to list *v1.Secret: secrets "console-oauth-config" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object
Sep 30 17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.893305 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"console-oauth-config\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"console-oauth-config\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.893428 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.893590 4778 reflector.go:561] object-"openshift-console"/"oauth-serving-cert": failed to list *v1.ConfigMap: configmaps "oauth-serving-cert" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object
Sep 30 17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.893611 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"oauth-serving-cert\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"oauth-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.899222 4778 reflector.go:561] object-"openshift-console"/"service-ca": failed to list *v1.ConfigMap: configmaps "service-ca" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object
Sep 30 17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.899278 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"service-ca\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"service-ca\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.899347 4778 reflector.go:561] object-"openshift-console"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object
Sep 30 17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.899363 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.900475 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.900993 4778 reflector.go:561] object-"openshift-console-operator"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console-operator": no relationship found between node 'crc' and this object
Sep 30 17:20:00 crc
kubenswrapper[4778]: E0930 17:20:00.901049 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.901169 4778 reflector.go:561] object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr": failed to list *v1.Secret: secrets "console-operator-dockercfg-4xjcr" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-console-operator": no relationship found between node 'crc' and this object Sep 30 17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.901197 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-console-operator\"/\"console-operator-dockercfg-4xjcr\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"console-operator-dockercfg-4xjcr\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.901259 4778 reflector.go:561] object-"openshift-console-operator"/"serving-cert": failed to list *v1.Secret: secrets "serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-console-operator": no relationship found between node 'crc' and this object Sep 30 17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.901281 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-console-operator\"/\"serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.902550 4778 reflector.go:561] object-"openshift-console-operator"/"console-operator-config": failed to list *v1.ConfigMap: configmaps "console-operator-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console-operator": no relationship found between node 'crc' and this object Sep 30 17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.902580 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-console-operator\"/\"console-operator-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"console-operator-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.902654 4778 reflector.go:561] object-"openshift-console"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object Sep 30 
17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.902670 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.902740 4778 reflector.go:561] object-"openshift-console"/"trusted-ca-bundle": failed to list *v1.ConfigMap: configmaps "trusted-ca-bundle" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object Sep 30 17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.902754 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"trusted-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"trusted-ca-bundle\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 17:20:00 crc kubenswrapper[4778]: W0930 17:20:00.902794 4778 reflector.go:561] object-"openshift-console"/"console-serving-cert": failed to list *v1.Secret: secrets "console-serving-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-console": no relationship found between node 'crc' and this object Sep 30 17:20:00 crc kubenswrapper[4778]: E0930 17:20:00.902807 4778 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"console-serving-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"console-serving-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'crc' and this object" logger="UnhandledError" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.902866 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k755p"] Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.903333 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb"] Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.903400 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.903859 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.904022 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k755p" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.904115 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.904591 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.911833 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.912793 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.913423 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.913716 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.914496 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.914897 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.915117 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.917924 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.928462 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vp8fx"] Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.929147 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vp8fx" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.929652 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.929907 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.930343 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.930510 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.930692 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.930792 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.930934 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.931213 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.931398 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.931493 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.931581 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.931933 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.932002 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.932043 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.932157 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.932224 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.932316 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.932055 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v4spf"] Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.932379 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"etcd-serving-ca" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.931842 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.932637 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.933116 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v4spf" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.937363 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-whtc5"] Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.938149 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-56687"] Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.938483 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.938585 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.943258 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.943410 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.943496 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.943579 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.943818 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.944656 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.947837 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.948935 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.949002 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.949318 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.949516 4778 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"machine-api-operator-images" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.949719 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-xr85z"] Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.950337 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-xr85z" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.949732 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.956735 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-b72ms"] Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.957662 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-sztwq"] Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.957999 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs"] Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.958330 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.958364 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.958364 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.959184 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.959292 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.959429 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.959576 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.959679 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.959698 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.959842 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.959918 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.959992 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.960059 4778 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.960096 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r"] Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.960141 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.960209 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.960286 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.960362 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.960508 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.960608 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.961145 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.961387 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962020 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n44qt\" (UniqueName: \"kubernetes.io/projected/7bc95114-073a-45f7-bf10-c008e08c6e52-kube-api-access-n44qt\") pod \"cluster-samples-operator-665b6dd947-v4spf\" (UID: \"7bc95114-073a-45f7-bf10-c008e08c6e52\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v4spf" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962054 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96905ba5-6042-4555-aad7-0ac5abb5e6e2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vp8fx\" (UID: \"96905ba5-6042-4555-aad7-0ac5abb5e6e2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vp8fx" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962070 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bc95114-073a-45f7-bf10-c008e08c6e52-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-v4spf\" (UID: \"7bc95114-073a-45f7-bf10-c008e08c6e52\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v4spf" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962086 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-client-ca\") pod 
\"route-controller-manager-6576b87f9c-kcflv\" (UID: \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962104 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-service-ca\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962120 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-serving-cert\") pod \"etcd-operator-b45778765-nprqp\" (UID: \"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962136 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ktmg\" (UniqueName: \"kubernetes.io/projected/971714d1-d7ca-458a-98a3-7f0172c2e3c1-kube-api-access-7ktmg\") pod \"dns-operator-744455d44c-zwnt5\" (UID: \"971714d1-d7ca-458a-98a3-7f0172c2e3c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-zwnt5" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962157 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96v5t\" (UniqueName: \"kubernetes.io/projected/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-kube-api-access-96v5t\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962175 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/38eeb4e2-2d0a-43c3-b305-aa464ec83096-audit-policies\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962190 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f326088-a1d0-43ee-9b2a-41e7ac797679-serving-cert\") pod \"console-operator-58897d9998-hm822\" (UID: \"5f326088-a1d0-43ee-9b2a-41e7ac797679\") " pod="openshift-console-operator/console-operator-58897d9998-hm822" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962211 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-oauth-serving-cert\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962228 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9vwx\" (UniqueName: \"kubernetes.io/projected/1075512e-3cc0-40fe-938c-07f5baaf964e-kube-api-access-r9vwx\") pod \"openshift-config-operator-7777fb866f-k755p\" (UID: \"1075512e-3cc0-40fe-938c-07f5baaf964e\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-k755p" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962250 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96905ba5-6042-4555-aad7-0ac5abb5e6e2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vp8fx\" (UID: \"96905ba5-6042-4555-aad7-0ac5abb5e6e2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vp8fx" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962265 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkph7\" (UniqueName: \"kubernetes.io/projected/a6003c01-a0dc-4474-88f8-cf178d542b63-kube-api-access-nkph7\") pod \"machine-approver-56656f9798-rqn89\" (UID: \"a6003c01-a0dc-4474-88f8-cf178d542b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962280 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-client-ca\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962295 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/38eeb4e2-2d0a-43c3-b305-aa464ec83096-encryption-config\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962309 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38eeb4e2-2d0a-43c3-b305-aa464ec83096-serving-cert\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962338 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f326088-a1d0-43ee-9b2a-41e7ac797679-config\") pod \"console-operator-58897d9998-hm822\" (UID: \"5f326088-a1d0-43ee-9b2a-41e7ac797679\") " pod="openshift-console-operator/console-operator-58897d9998-hm822" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962356 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-etcd-ca\") pod \"etcd-operator-b45778765-nprqp\" (UID: \"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962458 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdtxj\" (UniqueName: \"kubernetes.io/projected/96905ba5-6042-4555-aad7-0ac5abb5e6e2-kube-api-access-tdtxj\") pod \"openshift-controller-manager-operator-756b6f6bc6-vp8fx\" (UID: \"96905ba5-6042-4555-aad7-0ac5abb5e6e2\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vp8fx" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962520 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48gx2\" (UniqueName: \"kubernetes.io/projected/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-kube-api-access-48gx2\") pod \"route-controller-manager-6576b87f9c-kcflv\" (UID: \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962551 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/971714d1-d7ca-458a-98a3-7f0172c2e3c1-metrics-tls\") pod \"dns-operator-744455d44c-zwnt5\" (UID: \"971714d1-d7ca-458a-98a3-7f0172c2e3c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-zwnt5" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962585 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/38eeb4e2-2d0a-43c3-b305-aa464ec83096-etcd-client\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962645 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38eeb4e2-2d0a-43c3-b305-aa464ec83096-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962665 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-serving-cert\") pod \"route-controller-manager-6576b87f9c-kcflv\" (UID: \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962691 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgczg\" (UniqueName: \"kubernetes.io/projected/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-kube-api-access-jgczg\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962730 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962751 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/38eeb4e2-2d0a-43c3-b305-aa464ec83096-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962769 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6003c01-a0dc-4474-88f8-cf178d542b63-config\") pod \"machine-approver-56656f9798-rqn89\" (UID: \"a6003c01-a0dc-4474-88f8-cf178d542b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962800 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f326088-a1d0-43ee-9b2a-41e7ac797679-trusted-ca\") pod \"console-operator-58897d9998-hm822\" (UID: \"5f326088-a1d0-43ee-9b2a-41e7ac797679\") " pod="openshift-console-operator/console-operator-58897d9998-hm822" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962843 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-serving-cert\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962881 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a6003c01-a0dc-4474-88f8-cf178d542b63-auth-proxy-config\") pod \"machine-approver-56656f9798-rqn89\" (UID: \"a6003c01-a0dc-4474-88f8-cf178d542b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962897 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nbbq\" (UniqueName: \"kubernetes.io/projected/5f326088-a1d0-43ee-9b2a-41e7ac797679-kube-api-access-5nbbq\") pod \"console-operator-58897d9998-hm822\" (UID: \"5f326088-a1d0-43ee-9b2a-41e7ac797679\") " pod="openshift-console-operator/console-operator-58897d9998-hm822" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962901 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962921 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-config\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962960 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ck2p\" (UniqueName: \"kubernetes.io/projected/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-kube-api-access-5ck2p\") pod \"etcd-operator-b45778765-nprqp\" (UID: \"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.962996 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-serving-cert\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.963029 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-config\") pod \"route-controller-manager-6576b87f9c-kcflv\" (UID: \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.963048 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmmpd\" (UniqueName: \"kubernetes.io/projected/38eeb4e2-2d0a-43c3-b305-aa464ec83096-kube-api-access-mmmpd\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.963069 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-oauth-config\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.963104 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1075512e-3cc0-40fe-938c-07f5baaf964e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k755p\" (UID: \"1075512e-3cc0-40fe-938c-07f5baaf964e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k755p" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.963130 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-etcd-client\") pod \"etcd-operator-b45778765-nprqp\" (UID: \"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.963152 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-config\") pod \"etcd-operator-b45778765-nprqp\" (UID: \"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.963183 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-etcd-service-ca\") pod \"etcd-operator-b45778765-nprqp\" (UID: \"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.963200 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a6003c01-a0dc-4474-88f8-cf178d542b63-machine-approver-tls\") pod \"machine-approver-56656f9798-rqn89\" 
(UID: \"a6003c01-a0dc-4474-88f8-cf178d542b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.963218 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/38eeb4e2-2d0a-43c3-b305-aa464ec83096-audit-dir\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.963237 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1075512e-3cc0-40fe-938c-07f5baaf964e-serving-cert\") pod \"openshift-config-operator-7777fb866f-k755p\" (UID: \"1075512e-3cc0-40fe-938c-07f5baaf964e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k755p" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.963294 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-config\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.963495 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.964408 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-trusted-ca-bundle\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.965689 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.966214 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.966458 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.966560 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zwnt5"] Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.966591 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7njlp"] Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.987828 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7njlp" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.989119 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.991891 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-95zm8"] Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.998905 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Sep 30 17:20:00 crc kubenswrapper[4778]: I0930 17:20:00.999018 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.012077 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.012319 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.012744 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.013564 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.015303 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.017138 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkph9"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.017608 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkph9" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.017691 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.018604 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-95zm8" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.020885 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-svds2"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.021847 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.024678 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wxgcd"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.025083 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.025644 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.027933 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pl4fp"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.028371 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.030944 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.034951 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.035014 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vxnc"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.035886 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vxnc" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.036182 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dtqsp"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.037140 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.041132 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nprqp"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.043328 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p6wqv"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.045006 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p6wqv" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.045425 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k755p"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.046338 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.046478 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.047993 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.048731 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.067957 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b30e2325-09e6-41a0-a0ad-67c3e01d2627-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kghfs\" (UID: \"b30e2325-09e6-41a0-a0ad-67c3e01d2627\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.067991 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e848ffb5-3244-4b27-a090-c63685145174-audit-dir\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068030 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1075512e-3cc0-40fe-938c-07f5baaf964e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k755p\" (UID: \"1075512e-3cc0-40fe-938c-07f5baaf964e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k755p" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068050 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-oauth-config\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068068 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxvcz\" (UniqueName: \"kubernetes.io/projected/b30e2325-09e6-41a0-a0ad-67c3e01d2627-kube-api-access-vxvcz\") pod \"openshift-apiserver-operator-796bbdcf4f-kghfs\" (UID: \"b30e2325-09e6-41a0-a0ad-67c3e01d2627\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068100 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e848ffb5-3244-4b27-a090-c63685145174-node-pullsecrets\") pod \"apiserver-76f77b778f-b72ms\" 
(UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068116 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpfsz\" (UniqueName: \"kubernetes.io/projected/7b2932cf-321f-41d9-b5f7-e9969592da8d-kube-api-access-lpfsz\") pod \"multus-admission-controller-857f4d67dd-7njlp\" (UID: \"7b2932cf-321f-41d9-b5f7-e9969592da8d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7njlp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068135 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-etcd-client\") pod \"etcd-operator-b45778765-nprqp\" (UID: \"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068149 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dkph9\" (UID: \"efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkph9" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068177 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e848ffb5-3244-4b27-a090-c63685145174-trusted-ca-bundle\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068196 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-config\") pod \"etcd-operator-b45778765-nprqp\" (UID: \"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068209 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-etcd-service-ca\") pod \"etcd-operator-b45778765-nprqp\" (UID: \"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068224 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dkph9\" (UID: \"efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkph9" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068255 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a6003c01-a0dc-4474-88f8-cf178d542b63-machine-approver-tls\") pod \"machine-approver-56656f9798-rqn89\" (UID: \"a6003c01-a0dc-4474-88f8-cf178d542b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89" Sep 30 
17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068271 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e848ffb5-3244-4b27-a090-c63685145174-config\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068288 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8fjk\" (UniqueName: \"kubernetes.io/projected/4d85180d-59c0-4f8f-8481-170f27db08b6-kube-api-access-z8fjk\") pod \"router-default-5444994796-svds2\" (UID: \"4d85180d-59c0-4f8f-8481-170f27db08b6\") " pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068303 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1075512e-3cc0-40fe-938c-07f5baaf964e-serving-cert\") pod \"openshift-config-operator-7777fb866f-k755p\" (UID: \"1075512e-3cc0-40fe-938c-07f5baaf964e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k755p" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068333 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/38eeb4e2-2d0a-43c3-b305-aa464ec83096-audit-dir\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068353 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3189a025-d8d6-42bb-9e59-81c4dc54c5f2-config\") pod \"authentication-operator-69f744f599-sztwq\" (UID: \"3189a025-d8d6-42bb-9e59-81c4dc54c5f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068372 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-trusted-ca-bundle\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068386 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-config\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068423 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96905ba5-6042-4555-aad7-0ac5abb5e6e2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vp8fx\" (UID: \"96905ba5-6042-4555-aad7-0ac5abb5e6e2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vp8fx" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068440 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/7bc95114-073a-45f7-bf10-c008e08c6e52-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-v4spf\" (UID: \"7bc95114-073a-45f7-bf10-c008e08c6e52\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v4spf" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068457 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n44qt\" (UniqueName: \"kubernetes.io/projected/7bc95114-073a-45f7-bf10-c008e08c6e52-kube-api-access-n44qt\") pod \"cluster-samples-operator-665b6dd947-v4spf\" (UID: \"7bc95114-073a-45f7-bf10-c008e08c6e52\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v4spf" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068500 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3189a025-d8d6-42bb-9e59-81c4dc54c5f2-service-ca-bundle\") pod \"authentication-operator-69f744f599-sztwq\" (UID: \"3189a025-d8d6-42bb-9e59-81c4dc54c5f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068519 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-client-ca\") pod \"route-controller-manager-6576b87f9c-kcflv\" (UID: \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068534 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d85180d-59c0-4f8f-8481-170f27db08b6-service-ca-bundle\") pod \"router-default-5444994796-svds2\" (UID: \"4d85180d-59c0-4f8f-8481-170f27db08b6\") " pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068550 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-service-ca\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068578 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-serving-cert\") pod \"etcd-operator-b45778765-nprqp\" (UID: \"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068594 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b30e2325-09e6-41a0-a0ad-67c3e01d2627-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kghfs\" (UID: \"b30e2325-09e6-41a0-a0ad-67c3e01d2627\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068611 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ktmg\" (UniqueName: \"kubernetes.io/projected/971714d1-d7ca-458a-98a3-7f0172c2e3c1-kube-api-access-7ktmg\") pod 
\"dns-operator-744455d44c-zwnt5\" (UID: \"971714d1-d7ca-458a-98a3-7f0172c2e3c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-zwnt5" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068646 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96v5t\" (UniqueName: \"kubernetes.io/projected/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-kube-api-access-96v5t\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068663 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/38eeb4e2-2d0a-43c3-b305-aa464ec83096-audit-policies\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068679 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f326088-a1d0-43ee-9b2a-41e7ac797679-serving-cert\") pod \"console-operator-58897d9998-hm822\" (UID: \"5f326088-a1d0-43ee-9b2a-41e7ac797679\") " pod="openshift-console-operator/console-operator-58897d9998-hm822" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068721 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-oauth-serving-cert\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068737 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9vwx\" (UniqueName: \"kubernetes.io/projected/1075512e-3cc0-40fe-938c-07f5baaf964e-kube-api-access-r9vwx\") pod \"openshift-config-operator-7777fb866f-k755p\" (UID: \"1075512e-3cc0-40fe-938c-07f5baaf964e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k755p" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068751 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dkph9\" (UID: \"efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkph9" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068767 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-client-ca\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068798 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/38eeb4e2-2d0a-43c3-b305-aa464ec83096-encryption-config\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:01 crc kubenswrapper[4778]: 
I0930 17:20:01.068814 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96905ba5-6042-4555-aad7-0ac5abb5e6e2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vp8fx\" (UID: \"96905ba5-6042-4555-aad7-0ac5abb5e6e2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vp8fx" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068834 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkph7\" (UniqueName: \"kubernetes.io/projected/a6003c01-a0dc-4474-88f8-cf178d542b63-kube-api-access-nkph7\") pod \"machine-approver-56656f9798-rqn89\" (UID: \"a6003c01-a0dc-4474-88f8-cf178d542b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068850 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e848ffb5-3244-4b27-a090-c63685145174-encryption-config\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068883 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38eeb4e2-2d0a-43c3-b305-aa464ec83096-serving-cert\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068900 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4d85180d-59c0-4f8f-8481-170f27db08b6-default-certificate\") pod \"router-default-5444994796-svds2\" (UID: \"4d85180d-59c0-4f8f-8481-170f27db08b6\") " pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068947 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f326088-a1d0-43ee-9b2a-41e7ac797679-config\") pod \"console-operator-58897d9998-hm822\" (UID: \"5f326088-a1d0-43ee-9b2a-41e7ac797679\") " pod="openshift-console-operator/console-operator-58897d9998-hm822" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068963 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3189a025-d8d6-42bb-9e59-81c4dc54c5f2-serving-cert\") pod \"authentication-operator-69f744f599-sztwq\" (UID: \"3189a025-d8d6-42bb-9e59-81c4dc54c5f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.068980 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e848ffb5-3244-4b27-a090-c63685145174-serving-cert\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069006 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdtxj\" (UniqueName: 
\"kubernetes.io/projected/96905ba5-6042-4555-aad7-0ac5abb5e6e2-kube-api-access-tdtxj\") pod \"openshift-controller-manager-operator-756b6f6bc6-vp8fx\" (UID: \"96905ba5-6042-4555-aad7-0ac5abb5e6e2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vp8fx" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069038 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48gx2\" (UniqueName: \"kubernetes.io/projected/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-kube-api-access-48gx2\") pod \"route-controller-manager-6576b87f9c-kcflv\" (UID: \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069053 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-etcd-ca\") pod \"etcd-operator-b45778765-nprqp\" (UID: \"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069069 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcrr5\" (UniqueName: \"kubernetes.io/projected/e848ffb5-3244-4b27-a090-c63685145174-kube-api-access-dcrr5\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069087 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7b2932cf-321f-41d9-b5f7-e9969592da8d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7njlp\" (UID: \"7b2932cf-321f-41d9-b5f7-e9969592da8d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7njlp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069119 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/971714d1-d7ca-458a-98a3-7f0172c2e3c1-metrics-tls\") pod \"dns-operator-744455d44c-zwnt5\" (UID: \"971714d1-d7ca-458a-98a3-7f0172c2e3c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-zwnt5" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069135 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/38eeb4e2-2d0a-43c3-b305-aa464ec83096-etcd-client\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069150 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38eeb4e2-2d0a-43c3-b305-aa464ec83096-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069180 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d85180d-59c0-4f8f-8481-170f27db08b6-metrics-certs\") pod \"router-default-5444994796-svds2\" (UID: 
\"4d85180d-59c0-4f8f-8481-170f27db08b6\") " pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069197 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-serving-cert\") pod \"route-controller-manager-6576b87f9c-kcflv\" (UID: \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069218 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgczg\" (UniqueName: \"kubernetes.io/projected/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-kube-api-access-jgczg\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069240 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3189a025-d8d6-42bb-9e59-81c4dc54c5f2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-sztwq\" (UID: \"3189a025-d8d6-42bb-9e59-81c4dc54c5f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069273 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069288 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e848ffb5-3244-4b27-a090-c63685145174-image-import-ca\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069307 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/38eeb4e2-2d0a-43c3-b305-aa464ec83096-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069340 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6003c01-a0dc-4474-88f8-cf178d542b63-config\") pod \"machine-approver-56656f9798-rqn89\" (UID: \"a6003c01-a0dc-4474-88f8-cf178d542b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069356 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f326088-a1d0-43ee-9b2a-41e7ac797679-trusted-ca\") pod \"console-operator-58897d9998-hm822\" (UID: \"5f326088-a1d0-43ee-9b2a-41e7ac797679\") " pod="openshift-console-operator/console-operator-58897d9998-hm822" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069374 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-serving-cert\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069375 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/1075512e-3cc0-40fe-938c-07f5baaf964e-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k755p\" (UID: \"1075512e-3cc0-40fe-938c-07f5baaf964e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k755p" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069390 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e848ffb5-3244-4b27-a090-c63685145174-etcd-client\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069423 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-config\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069440 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a6003c01-a0dc-4474-88f8-cf178d542b63-auth-proxy-config\") pod \"machine-approver-56656f9798-rqn89\" (UID: \"a6003c01-a0dc-4474-88f8-cf178d542b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069458 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nbbq\" (UniqueName: \"kubernetes.io/projected/5f326088-a1d0-43ee-9b2a-41e7ac797679-kube-api-access-5nbbq\") pod \"console-operator-58897d9998-hm822\" (UID: \"5f326088-a1d0-43ee-9b2a-41e7ac797679\") " pod="openshift-console-operator/console-operator-58897d9998-hm822" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069474 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e848ffb5-3244-4b27-a090-c63685145174-audit\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069507 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e848ffb5-3244-4b27-a090-c63685145174-etcd-serving-ca\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069530 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ck2p\" (UniqueName: \"kubernetes.io/projected/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-kube-api-access-5ck2p\") pod \"etcd-operator-b45778765-nprqp\" (UID: 
\"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069547 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-serving-cert\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069577 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-config\") pod \"route-controller-manager-6576b87f9c-kcflv\" (UID: \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069594 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmmpd\" (UniqueName: \"kubernetes.io/projected/38eeb4e2-2d0a-43c3-b305-aa464ec83096-kube-api-access-mmmpd\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069630 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjs7r\" (UniqueName: \"kubernetes.io/projected/3189a025-d8d6-42bb-9e59-81c4dc54c5f2-kube-api-access-sjs7r\") pod \"authentication-operator-69f744f599-sztwq\" (UID: \"3189a025-d8d6-42bb-9e59-81c4dc54c5f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.069651 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4d85180d-59c0-4f8f-8481-170f27db08b6-stats-auth\") pod \"router-default-5444994796-svds2\" (UID: \"4d85180d-59c0-4f8f-8481-170f27db08b6\") " pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.072728 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-etcd-ca\") pod \"etcd-operator-b45778765-nprqp\" (UID: \"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.072869 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-config\") pod \"etcd-operator-b45778765-nprqp\" (UID: \"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.073141 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.073334 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-etcd-service-ca\") pod \"etcd-operator-b45778765-nprqp\" (UID: \"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.073357 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.074502 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.075236 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b8229"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.075854 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.076402 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96905ba5-6042-4555-aad7-0ac5abb5e6e2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-vp8fx\" (UID: \"96905ba5-6042-4555-aad7-0ac5abb5e6e2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vp8fx" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.076460 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/38eeb4e2-2d0a-43c3-b305-aa464ec83096-audit-dir\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.076933 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-client-ca\") pod \"route-controller-manager-6576b87f9c-kcflv\" (UID: \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.076981 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a6003c01-a0dc-4474-88f8-cf178d542b63-machine-approver-tls\") pod \"machine-approver-56656f9798-rqn89\" (UID: \"a6003c01-a0dc-4474-88f8-cf178d542b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.077204 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38eeb4e2-2d0a-43c3-b305-aa464ec83096-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.077569 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a6003c01-a0dc-4474-88f8-cf178d542b63-auth-proxy-config\") pod \"machine-approver-56656f9798-rqn89\" (UID: \"a6003c01-a0dc-4474-88f8-cf178d542b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.078651 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a6003c01-a0dc-4474-88f8-cf178d542b63-config\") pod \"machine-approver-56656f9798-rqn89\" (UID: \"a6003c01-a0dc-4474-88f8-cf178d542b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.078839 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.078938 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/38eeb4e2-2d0a-43c3-b305-aa464ec83096-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.079201 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.079387 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/38eeb4e2-2d0a-43c3-b305-aa464ec83096-etcd-client\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.079920 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f326088-a1d0-43ee-9b2a-41e7ac797679-trusted-ca\") pod \"console-operator-58897d9998-hm822\" (UID: \"5f326088-a1d0-43ee-9b2a-41e7ac797679\") " pod="openshift-console-operator/console-operator-58897d9998-hm822" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.080565 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-config\") pod \"route-controller-manager-6576b87f9c-kcflv\" (UID: \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.081104 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/971714d1-d7ca-458a-98a3-7f0172c2e3c1-metrics-tls\") pod \"dns-operator-744455d44c-zwnt5\" (UID: \"971714d1-d7ca-458a-98a3-7f0172c2e3c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-zwnt5" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.081476 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-etcd-client\") pod \"etcd-operator-b45778765-nprqp\" (UID: \"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.081799 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/38eeb4e2-2d0a-43c3-b305-aa464ec83096-audit-policies\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: 
\"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.082666 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-serving-cert\") pod \"route-controller-manager-6576b87f9c-kcflv\" (UID: \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.083472 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-serving-cert\") pod \"etcd-operator-b45778765-nprqp\" (UID: \"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.087231 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/38eeb4e2-2d0a-43c3-b305-aa464ec83096-encryption-config\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.087773 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7bc95114-073a-45f7-bf10-c008e08c6e52-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-v4spf\" (UID: \"7bc95114-073a-45f7-bf10-c008e08c6e52\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v4spf" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.090216 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.090683 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.090977 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-klvfk"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.091195 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b8229" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.091227 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.091216 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.101752 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1075512e-3cc0-40fe-938c-07f5baaf964e-serving-cert\") pod \"openshift-config-operator-7777fb866f-k755p\" (UID: \"1075512e-3cc0-40fe-938c-07f5baaf964e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k755p" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.102638 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96905ba5-6042-4555-aad7-0ac5abb5e6e2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-vp8fx\" (UID: \"96905ba5-6042-4555-aad7-0ac5abb5e6e2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vp8fx" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.103396 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.103940 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nfjrg"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.106893 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bjv8b"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.107175 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-klvfk" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.107566 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.108106 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfjrg" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.109068 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/38eeb4e2-2d0a-43c3-b305-aa464ec83096-serving-cert\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.110558 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.111600 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-p7f84"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.112354 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.116236 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.116805 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-w8zpd"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.117262 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-p7f84" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.117345 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8zpd" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.117888 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.118890 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.119260 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-td2mw"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.120183 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.120342 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-sztwq"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.121435 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7njlp"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.121973 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.122875 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-whtc5"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.124158 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vxnc"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.125472 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.127659 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vp8fx"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.128823 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-klvfk"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.130329 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v4spf"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.131744 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.133167 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-56687"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.134375 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hm822"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.135571 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p6wqv"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.137005 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.138803 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-b72ms"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.139930 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bjv8b"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.140786 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.141567 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.142464 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.144067 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.145638 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-td2mw"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.147083 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-95zm8"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.148374 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nfjrg"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.149918 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dtqsp"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.151270 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rzzcv"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.151854 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rzzcv" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.152449 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-558p9"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.153056 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-558p9" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.153599 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b8229"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.154680 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.156569 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-p7f84"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.157483 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkph9"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.158792 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.160166 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-w8zpd"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.161099 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.161598 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xr85z"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.162923 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.163988 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rzzcv"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.165188 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-558p9"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.166166 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-dq4bf"] Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.166808 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dq4bf" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171158 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3189a025-d8d6-42bb-9e59-81c4dc54c5f2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-sztwq\" (UID: \"3189a025-d8d6-42bb-9e59-81c4dc54c5f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171191 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e848ffb5-3244-4b27-a090-c63685145174-image-import-ca\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171226 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e848ffb5-3244-4b27-a090-c63685145174-audit\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171245 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e848ffb5-3244-4b27-a090-c63685145174-etcd-client\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171287 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e848ffb5-3244-4b27-a090-c63685145174-etcd-serving-ca\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171317 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4d85180d-59c0-4f8f-8481-170f27db08b6-stats-auth\") pod \"router-default-5444994796-svds2\" (UID: \"4d85180d-59c0-4f8f-8481-170f27db08b6\") " pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171345 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjs7r\" (UniqueName: \"kubernetes.io/projected/3189a025-d8d6-42bb-9e59-81c4dc54c5f2-kube-api-access-sjs7r\") pod \"authentication-operator-69f744f599-sztwq\" (UID: \"3189a025-d8d6-42bb-9e59-81c4dc54c5f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171364 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b30e2325-09e6-41a0-a0ad-67c3e01d2627-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kghfs\" (UID: \"b30e2325-09e6-41a0-a0ad-67c3e01d2627\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171388 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/e848ffb5-3244-4b27-a090-c63685145174-audit-dir\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171413 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxvcz\" (UniqueName: \"kubernetes.io/projected/b30e2325-09e6-41a0-a0ad-67c3e01d2627-kube-api-access-vxvcz\") pod \"openshift-apiserver-operator-796bbdcf4f-kghfs\" (UID: \"b30e2325-09e6-41a0-a0ad-67c3e01d2627\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171432 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e848ffb5-3244-4b27-a090-c63685145174-node-pullsecrets\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171450 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpfsz\" (UniqueName: \"kubernetes.io/projected/7b2932cf-321f-41d9-b5f7-e9969592da8d-kube-api-access-lpfsz\") pod \"multus-admission-controller-857f4d67dd-7njlp\" (UID: \"7b2932cf-321f-41d9-b5f7-e9969592da8d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7njlp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171472 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dkph9\" (UID: \"efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkph9" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171497 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e848ffb5-3244-4b27-a090-c63685145174-trusted-ca-bundle\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171536 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dkph9\" (UID: \"efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkph9" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171567 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e848ffb5-3244-4b27-a090-c63685145174-config\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171586 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8fjk\" (UniqueName: \"kubernetes.io/projected/4d85180d-59c0-4f8f-8481-170f27db08b6-kube-api-access-z8fjk\") pod \"router-default-5444994796-svds2\" (UID: \"4d85180d-59c0-4f8f-8481-170f27db08b6\") " 
pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171583 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e848ffb5-3244-4b27-a090-c63685145174-node-pullsecrets\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171608 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3189a025-d8d6-42bb-9e59-81c4dc54c5f2-config\") pod \"authentication-operator-69f744f599-sztwq\" (UID: \"3189a025-d8d6-42bb-9e59-81c4dc54c5f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171659 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3189a025-d8d6-42bb-9e59-81c4dc54c5f2-service-ca-bundle\") pod \"authentication-operator-69f744f599-sztwq\" (UID: \"3189a025-d8d6-42bb-9e59-81c4dc54c5f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171690 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d85180d-59c0-4f8f-8481-170f27db08b6-service-ca-bundle\") pod \"router-default-5444994796-svds2\" (UID: \"4d85180d-59c0-4f8f-8481-170f27db08b6\") " pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171725 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b30e2325-09e6-41a0-a0ad-67c3e01d2627-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kghfs\" (UID: \"b30e2325-09e6-41a0-a0ad-67c3e01d2627\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171755 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dkph9\" (UID: \"efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkph9" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171797 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e848ffb5-3244-4b27-a090-c63685145174-encryption-config\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171822 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4d85180d-59c0-4f8f-8481-170f27db08b6-default-certificate\") pod \"router-default-5444994796-svds2\" (UID: \"4d85180d-59c0-4f8f-8481-170f27db08b6\") " pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171868 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3189a025-d8d6-42bb-9e59-81c4dc54c5f2-serving-cert\") pod \"authentication-operator-69f744f599-sztwq\" (UID: \"3189a025-d8d6-42bb-9e59-81c4dc54c5f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171892 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e848ffb5-3244-4b27-a090-c63685145174-serving-cert\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171911 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcrr5\" (UniqueName: \"kubernetes.io/projected/e848ffb5-3244-4b27-a090-c63685145174-kube-api-access-dcrr5\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171935 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7b2932cf-321f-41d9-b5f7-e9969592da8d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7njlp\" (UID: \"7b2932cf-321f-41d9-b5f7-e9969592da8d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7njlp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.171979 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d85180d-59c0-4f8f-8481-170f27db08b6-metrics-certs\") pod \"router-default-5444994796-svds2\" (UID: \"4d85180d-59c0-4f8f-8481-170f27db08b6\") " pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.172431 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b30e2325-09e6-41a0-a0ad-67c3e01d2627-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kghfs\" (UID: \"b30e2325-09e6-41a0-a0ad-67c3e01d2627\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.172495 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e848ffb5-3244-4b27-a090-c63685145174-audit-dir\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.172527 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e848ffb5-3244-4b27-a090-c63685145174-audit\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.173598 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3189a025-d8d6-42bb-9e59-81c4dc54c5f2-config\") pod \"authentication-operator-69f744f599-sztwq\" (UID: \"3189a025-d8d6-42bb-9e59-81c4dc54c5f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.173733 
4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3189a025-d8d6-42bb-9e59-81c4dc54c5f2-service-ca-bundle\") pod \"authentication-operator-69f744f599-sztwq\" (UID: \"3189a025-d8d6-42bb-9e59-81c4dc54c5f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq"
Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.173903 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e848ffb5-3244-4b27-a090-c63685145174-etcd-serving-ca\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms"
Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.174266 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e848ffb5-3244-4b27-a090-c63685145174-image-import-ca\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms"
Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.174287 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e848ffb5-3244-4b27-a090-c63685145174-trusted-ca-bundle\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms"
Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.175046 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3189a025-d8d6-42bb-9e59-81c4dc54c5f2-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-sztwq\" (UID: \"3189a025-d8d6-42bb-9e59-81c4dc54c5f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq"
Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.175864 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b30e2325-09e6-41a0-a0ad-67c3e01d2627-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kghfs\" (UID: \"b30e2325-09e6-41a0-a0ad-67c3e01d2627\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs"
Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.176397 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3189a025-d8d6-42bb-9e59-81c4dc54c5f2-serving-cert\") pod \"authentication-operator-69f744f599-sztwq\" (UID: \"3189a025-d8d6-42bb-9e59-81c4dc54c5f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq"
Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.177386 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e848ffb5-3244-4b27-a090-c63685145174-etcd-client\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms"
Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.179581 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e848ffb5-3244-4b27-a090-c63685145174-encryption-config\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms"
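Editor's note — every volume in this excerpt leaves the same two-step trace: reconciler_common.go:218 logs "operationExecutor.MountVolume started" when the desired-state-of-world reconciler picks the volume up, and operation_generator.go:637 logs "MountVolume.SetUp succeeded" once the mount completes. Pairing the two by UniqueName gives a per-volume mount latency (e.g. the audit-dir volume above: started 17:20:01.171388, succeeded 17:20:01.172495, about 1.1ms). A throwaway analysis sketch, ours and not part of kubelet, that does this pairing over `journalctl -u kubelet` text on stdin; the regexes are tuned to the record shapes visible here, not to every kubelet log format:

```go
// mountlat.go - pair "MountVolume started" / "MountVolume.SetUp succeeded"
// kubelet journal records by UniqueName and print the elapsed time.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"strings"
	"time"
)

var (
	// klog stamp, e.g. "I0930 17:20:01.171158" (level, MMDD, time, 6-digit micros).
	tsRe = regexp.MustCompile(`[IWE](\d{4} \d{2}:\d{2}:\d{2}\.\d{6})`)
	// UniqueName appears both escaped (\"...\") and unescaped ("...") in the log.
	uniqRe = regexp.MustCompile(`UniqueName: \\?"([^"\\]+)\\?"`)
)

func main() {
	started := map[string]time.Time{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1<<20), 1<<20) // journal lines are long
	for sc.Scan() {
		line := sc.Text()
		u := uniqRe.FindStringSubmatch(line)
		m := tsRe.FindStringSubmatch(line)
		if u == nil || m == nil {
			continue
		}
		// klog stamps carry no year; deltas within one boot are still valid.
		t, err := time.Parse("0102 15:04:05.000000", m[1])
		if err != nil {
			continue
		}
		switch {
		case strings.Contains(line, "MountVolume started"):
			started[u[1]] = t
		case strings.Contains(line, "MountVolume.SetUp succeeded"):
			if s, ok := started[u[1]]; ok {
				fmt.Printf("%v\t%s\n", t.Sub(s), u[1])
				delete(started, u[1])
			}
		}
	}
}
```

Run as `journalctl -u kubelet | go run mountlat.go`; it assumes one record per line, which is how the journal stores these entries before extraction flattened them.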
pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.181218 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.184206 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e848ffb5-3244-4b27-a090-c63685145174-config\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.201519 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.207195 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e848ffb5-3244-4b27-a090-c63685145174-serving-cert\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.252092 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.261957 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.281757 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.301662 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.306634 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7b2932cf-321f-41d9-b5f7-e9969592da8d-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-7njlp\" (UID: \"7b2932cf-321f-41d9-b5f7-e9969592da8d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7njlp" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.321726 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.341873 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.362402 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.382446 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.402386 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.406809 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-dkph9\" (UID: \"efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkph9" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.422611 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.433193 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dkph9\" (UID: \"efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkph9" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.442681 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.462212 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.482339 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.502084 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.522474 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.541922 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.549192 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4d85180d-59c0-4f8f-8481-170f27db08b6-default-certificate\") pod \"router-default-5444994796-svds2\" (UID: \"4d85180d-59c0-4f8f-8481-170f27db08b6\") " pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.561761 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.575283 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4d85180d-59c0-4f8f-8481-170f27db08b6-stats-auth\") pod \"router-default-5444994796-svds2\" (UID: \"4d85180d-59c0-4f8f-8481-170f27db08b6\") " pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.582029 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.585473 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4d85180d-59c0-4f8f-8481-170f27db08b6-metrics-certs\") pod \"router-default-5444994796-svds2\" (UID: \"4d85180d-59c0-4f8f-8481-170f27db08b6\") " pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 
17:20:01.601152 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.604798 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4d85180d-59c0-4f8f-8481-170f27db08b6-service-ca-bundle\") pod \"router-default-5444994796-svds2\" (UID: \"4d85180d-59c0-4f8f-8481-170f27db08b6\") " pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.620756 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.641605 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.661769 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.680835 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.702013 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.721399 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.743195 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.762606 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.802504 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.822122 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.840824 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.861524 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.881996 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.902443 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.922244 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Sep 30 17:20:01 
crc kubenswrapper[4778]: I0930 17:20:01.942344 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.971193 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Sep 30 17:20:01 crc kubenswrapper[4778]: I0930 17:20:01.981717 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.019044 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkph7\" (UniqueName: \"kubernetes.io/projected/a6003c01-a0dc-4474-88f8-cf178d542b63-kube-api-access-nkph7\") pod \"machine-approver-56656f9798-rqn89\" (UID: \"a6003c01-a0dc-4474-88f8-cf178d542b63\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.037868 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdtxj\" (UniqueName: \"kubernetes.io/projected/96905ba5-6042-4555-aad7-0ac5abb5e6e2-kube-api-access-tdtxj\") pod \"openshift-controller-manager-operator-756b6f6bc6-vp8fx\" (UID: \"96905ba5-6042-4555-aad7-0ac5abb5e6e2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vp8fx" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.059949 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48gx2\" (UniqueName: \"kubernetes.io/projected/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-kube-api-access-48gx2\") pod \"route-controller-manager-6576b87f9c-kcflv\" (UID: \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.070000 4778 secret.go:188] Couldn't get secret openshift-console/console-oauth-config: failed to sync secret cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.070100 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-oauth-config podName:82aaff94-0c3c-4a1b-be0c-5371c3b60ab0 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:02.570077917 +0000 UTC m=+141.559975720 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "console-oauth-config" (UniqueName: "kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-oauth-config") pod "console-f9d7485db-pl4fp" (UID: "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0") : failed to sync secret cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.072519 4778 configmap.go:193] Couldn't get configMap openshift-console-operator/console-operator-config: failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.072706 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f326088-a1d0-43ee-9b2a-41e7ac797679-config podName:5f326088-a1d0-43ee-9b2a-41e7ac797679 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:02.572676947 +0000 UTC m=+141.562574920 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/5f326088-a1d0-43ee-9b2a-41e7ac797679-config") pod "console-operator-58897d9998-hm822" (UID: "5f326088-a1d0-43ee-9b2a-41e7ac797679") : failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.074550 4778 secret.go:188] Couldn't get secret openshift-console-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.074613 4778 configmap.go:193] Couldn't get configMap openshift-console/oauth-serving-cert: failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.074904 4778 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.074649 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f326088-a1d0-43ee-9b2a-41e7ac797679-serving-cert podName:5f326088-a1d0-43ee-9b2a-41e7ac797679 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:02.574608212 +0000 UTC m=+141.564506225 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/5f326088-a1d0-43ee-9b2a-41e7ac797679-serving-cert") pod "console-operator-58897d9998-hm822" (UID: "5f326088-a1d0-43ee-9b2a-41e7ac797679") : failed to sync secret cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.075050 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-oauth-serving-cert podName:82aaff94-0c3c-4a1b-be0c-5371c3b60ab0 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:02.575004477 +0000 UTC m=+141.564902310 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "oauth-serving-cert" (UniqueName: "kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-oauth-serving-cert") pod "console-f9d7485db-pl4fp" (UID: "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0") : failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.075085 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-client-ca podName:c1da094b-f44d-4d9d-9fe8-3d19ea244d09 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:02.575068589 +0000 UTC m=+141.564966422 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-client-ca") pod "controller-manager-879f6c89f-wxgcd" (UID: "c1da094b-f44d-4d9d-9fe8-3d19ea244d09") : failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.075127 4778 configmap.go:193] Couldn't get configMap openshift-console/service-ca: failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.075185 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-service-ca podName:82aaff94-0c3c-4a1b-be0c-5371c3b60ab0 nodeName:}" failed. 
No retries permitted until 2025-09-30 17:20:02.575172684 +0000 UTC m=+141.565070517 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca" (UniqueName: "kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-service-ca") pod "console-f9d7485db-pl4fp" (UID: "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0") : failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.076950 4778 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.077012 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-config podName:c1da094b-f44d-4d9d-9fe8-3d19ea244d09 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:02.576995663 +0000 UTC m=+141.566893656 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-config") pod "controller-manager-879f6c89f-wxgcd" (UID: "c1da094b-f44d-4d9d-9fe8-3d19ea244d09") : failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.077036 4778 configmap.go:193] Couldn't get configMap openshift-console/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.077148 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-trusted-ca-bundle podName:82aaff94-0c3c-4a1b-be0c-5371c3b60ab0 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:02.577105428 +0000 UTC m=+141.567003391 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-trusted-ca-bundle") pod "console-f9d7485db-pl4fp" (UID: "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0") : failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.077153 4778 configmap.go:193] Couldn't get configMap openshift-console/console-config: failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.077209 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-config podName:82aaff94-0c3c-4a1b-be0c-5371c3b60ab0 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:02.577190051 +0000 UTC m=+141.567088074 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "console-config" (UniqueName: "kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-config") pod "console-f9d7485db-pl4fp" (UID: "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0") : failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.078152 4778 secret.go:188] Couldn't get secret openshift-console/console-serving-cert: failed to sync secret cache: timed out waiting for the condition Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.078216 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-serving-cert podName:82aaff94-0c3c-4a1b-be0c-5371c3b60ab0 nodeName:}" failed. 
No retries permitted until 2025-09-30 17:20:02.57820083 +0000 UTC m=+141.568098833 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "console-serving-cert" (UniqueName: "kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-serving-cert") pod "console-f9d7485db-pl4fp" (UID: "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0") : failed to sync secret cache: timed out waiting for the condition
Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.079587 4778 request.go:700] Waited for 1.003577445s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns-operator/serviceaccounts/dns-operator/token
Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.079587 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9vwx\" (UniqueName: \"kubernetes.io/projected/1075512e-3cc0-40fe-938c-07f5baaf964e-kube-api-access-r9vwx\") pod \"openshift-config-operator-7777fb866f-k755p\" (UID: \"1075512e-3cc0-40fe-938c-07f5baaf964e\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k755p"
Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.079683 4778 secret.go:188] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition
Sep 30 17:20:02 crc kubenswrapper[4778]: E0930 17:20:02.079889 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-serving-cert podName:c1da094b-f44d-4d9d-9fe8-3d19ea244d09 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:02.579866064 +0000 UTC m=+141.569764027 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-serving-cert") pod "controller-manager-879f6c89f-wxgcd" (UID: "c1da094b-f44d-4d9d-9fe8-3d19ea244d09") : failed to sync secret cache: timed out waiting for the condition
Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.102760 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89"
Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.108121 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ktmg\" (UniqueName: \"kubernetes.io/projected/971714d1-d7ca-458a-98a3-7f0172c2e3c1-kube-api-access-7ktmg\") pod \"dns-operator-744455d44c-zwnt5\" (UID: \"971714d1-d7ca-458a-98a3-7f0172c2e3c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-zwnt5"
Sep 30 17:20:02 crc kubenswrapper[4778]: W0930 17:20:02.130139 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6003c01_a0dc_4474_88f8_cf178d542b63.slice/crio-5cc5d038a12d7874069e9a0ef57e57f81f2de437ee3e2721a51f03a484821293 WatchSource:0}: Error finding container 5cc5d038a12d7874069e9a0ef57e57f81f2de437ee3e2721a51f03a484821293: Status 404 returned error can't find the container with id 5cc5d038a12d7874069e9a0ef57e57f81f2de437ee3e2721a51f03a484821293
Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.134939 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-zwnt5"
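Editor's note — the E-level bursts above are the kubelet's volume manager refusing to retry a failed SetUp immediately: nestedpendingoperations.go:348 stamps each failed operation with a "durationBeforeRetry" that starts at 500ms (visible in every message here) and, in the upstream exponentialbackoff package, doubles per consecutive failure up to a cap of roughly two minutes (our reading of the kubelet source; treat the doubling factor and cap as assumptions, only the 500ms initial delay is confirmed by this log). A minimal sketch of that policy in plain Go:

```go
// backoff.go - capped exponential backoff as implied by the
// "durationBeforeRetry 500ms" messages above. 500ms is from the log;
// the 2x factor and 2m2s cap are assumed from kubelet defaults.
package main

import (
	"errors"
	"fmt"
	"time"
)

func retry(op func() error) {
	const (
		initialDelay = 500 * time.Millisecond
		maxDelay     = 2*time.Minute + 2*time.Second
	)
	delay := initialDelay
	for attempt := 1; ; attempt++ {
		err := op()
		if err == nil {
			fmt.Printf("attempt %d succeeded\n", attempt)
			return
		}
		fmt.Printf("attempt %d: %v; no retries permitted for %v\n", attempt, err, delay)
		time.Sleep(delay) // the kubelet skips the operation until the deadline passes
		if delay *= 2; delay > maxDelay {
			delay = maxDelay
		}
	}
}

func main() {
	n := 0
	retry(func() error {
		// Fails a few times, like a mount racing an unsynced secret cache.
		if n++; n < 4 {
			return errors.New("failed to sync secret cache: timed out waiting for the condition")
		}
		return nil
	})
}
```

In the log the retries succeed quickly because the underlying cause (the object cache not yet synced, see the reflector lines) clears within the first 500ms window.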
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-zwnt5" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.137195 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n44qt\" (UniqueName: \"kubernetes.io/projected/7bc95114-073a-45f7-bf10-c008e08c6e52-kube-api-access-n44qt\") pod \"cluster-samples-operator-665b6dd947-v4spf\" (UID: \"7bc95114-073a-45f7-bf10-c008e08c6e52\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v4spf" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.141749 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.162396 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.182244 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.187163 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.201315 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.222736 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.272452 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ck2p\" (UniqueName: \"kubernetes.io/projected/4f2a6b6e-ec6e-491b-b997-10f2435e42a4-kube-api-access-5ck2p\") pod \"etcd-operator-b45778765-nprqp\" (UID: \"4f2a6b6e-ec6e-491b-b997-10f2435e42a4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.289443 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmmpd\" (UniqueName: \"kubernetes.io/projected/38eeb4e2-2d0a-43c3-b305-aa464ec83096-kube-api-access-mmmpd\") pod \"apiserver-7bbb656c7d-pkzrb\" (UID: \"38eeb4e2-2d0a-43c3-b305-aa464ec83096\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.310423 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k755p" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.319197 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.321278 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.328086 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vp8fx" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.334734 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v4spf" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.344015 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.353155 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-zwnt5"] Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.361695 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Sep 30 17:20:02 crc kubenswrapper[4778]: W0930 17:20:02.364691 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod971714d1_d7ca_458a_98a3_7f0172c2e3c1.slice/crio-113eb1eea10b3c58c8b64c5a91556a647d93e55907b310f64891339ae2b8af98 WatchSource:0}: Error finding container 113eb1eea10b3c58c8b64c5a91556a647d93e55907b310f64891339ae2b8af98: Status 404 returned error can't find the container with id 113eb1eea10b3c58c8b64c5a91556a647d93e55907b310f64891339ae2b8af98 Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.381970 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.385409 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv"] Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.401267 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.414083 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.421719 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.442527 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.462367 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.482311 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.502333 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.515991 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" event={"ID":"637f30f0-8906-4b9b-bfa8-b356ee2f88d9","Type":"ContainerStarted","Data":"d8affaeddf0425e1f3a90b0165045686c3cbbfe4e9954a67dbf1b63cb442bf50"} Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.516063 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k755p"] Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.517625 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zwnt5" event={"ID":"971714d1-d7ca-458a-98a3-7f0172c2e3c1","Type":"ContainerStarted","Data":"113eb1eea10b3c58c8b64c5a91556a647d93e55907b310f64891339ae2b8af98"} Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.519409 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89" event={"ID":"a6003c01-a0dc-4474-88f8-cf178d542b63","Type":"ContainerStarted","Data":"a0f7ea1690f263ecec5052318766278e07e67201d3fbb4fa9b3389a626423a6d"} Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.519434 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89" event={"ID":"a6003c01-a0dc-4474-88f8-cf178d542b63","Type":"ContainerStarted","Data":"5cc5d038a12d7874069e9a0ef57e57f81f2de437ee3e2721a51f03a484821293"} Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.522805 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.541488 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.563320 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.582578 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.589976 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-trusted-ca-bundle\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.590024 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-config\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.590074 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-service-ca\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.590119 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f326088-a1d0-43ee-9b2a-41e7ac797679-serving-cert\") pod \"console-operator-58897d9998-hm822\" (UID: \"5f326088-a1d0-43ee-9b2a-41e7ac797679\") " pod="openshift-console-operator/console-operator-58897d9998-hm822" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.590162 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-oauth-serving-cert\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.590192 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-client-ca\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.590240 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f326088-a1d0-43ee-9b2a-41e7ac797679-config\") pod \"console-operator-58897d9998-hm822\" (UID: \"5f326088-a1d0-43ee-9b2a-41e7ac797679\") " pod="openshift-console-operator/console-operator-58897d9998-hm822" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.590412 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-serving-cert\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.590444 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-config\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:02 crc 
kubenswrapper[4778]: I0930 17:20:02.590482 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-serving-cert\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.590525 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-oauth-config\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.602265 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.621839 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.644242 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.669550 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.680511 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.690944 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-nprqp"] Sep 30 17:20:02 crc kubenswrapper[4778]: W0930 17:20:02.699351 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f2a6b6e_ec6e_491b_b997_10f2435e42a4.slice/crio-5be9567d4f8206cace6f1a3a31f3726eae8d05c04a2d715d61b13fda1e921ab9 WatchSource:0}: Error finding container 5be9567d4f8206cace6f1a3a31f3726eae8d05c04a2d715d61b13fda1e921ab9: Status 404 returned error can't find the container with id 5be9567d4f8206cace6f1a3a31f3726eae8d05c04a2d715d61b13fda1e921ab9 Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.700991 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.721597 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.741897 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.763528 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.777634 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vp8fx"] Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.783639 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Sep 30 17:20:02 crc kubenswrapper[4778]: 
I0930 17:20:02.784056 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v4spf"] Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.793846 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb"] Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.803041 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.821927 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Sep 30 17:20:02 crc kubenswrapper[4778]: W0930 17:20:02.830108 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38eeb4e2_2d0a_43c3_b305_aa464ec83096.slice/crio-28adcfc15de479c2beeb59409e588c1679fb721b6cdf78d5623406f3eacc4673 WatchSource:0}: Error finding container 28adcfc15de479c2beeb59409e588c1679fb721b6cdf78d5623406f3eacc4673: Status 404 returned error can't find the container with id 28adcfc15de479c2beeb59409e588c1679fb721b6cdf78d5623406f3eacc4673 Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.841496 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.862758 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.882322 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.902745 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.921775 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.941841 4778 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.961381 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Sep 30 17:20:02 crc kubenswrapper[4778]: I0930 17:20:02.981756 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.001750 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.021830 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.042551 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.062421 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.081935 4778 
request.go:700] Waited for 1.926954733s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-dockercfg-jwfmh&limit=500&resourceVersion=0
Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.084749 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.101863 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.116075 4778 projected.go:288] Couldn't get configMap openshift-controller-manager/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.121524 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.141266 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.163029 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.202503 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjs7r\" (UniqueName: \"kubernetes.io/projected/3189a025-d8d6-42bb-9e59-81c4dc54c5f2-kube-api-access-sjs7r\") pod \"authentication-operator-69f744f599-sztwq\" (UID: \"3189a025-d8d6-42bb-9e59-81c4dc54c5f2\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq"
Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.231277 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpfsz\" (UniqueName: \"kubernetes.io/projected/7b2932cf-321f-41d9-b5f7-e9969592da8d-kube-api-access-lpfsz\") pod \"multus-admission-controller-857f4d67dd-7njlp\" (UID: \"7b2932cf-321f-41d9-b5f7-e9969592da8d\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-7njlp"
Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.238465 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxvcz\" (UniqueName: \"kubernetes.io/projected/b30e2325-09e6-41a0-a0ad-67c3e01d2627-kube-api-access-vxvcz\") pod \"openshift-apiserver-operator-796bbdcf4f-kghfs\" (UID: \"b30e2325-09e6-41a0-a0ad-67c3e01d2627\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs"
Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.257710 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8fjk\" (UniqueName: \"kubernetes.io/projected/4d85180d-59c0-4f8f-8481-170f27db08b6-kube-api-access-z8fjk\") pod \"router-default-5444994796-svds2\" (UID: \"4d85180d-59c0-4f8f-8481-170f27db08b6\") " pod="openshift-ingress/router-default-5444994796-svds2"
Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.269157 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs"
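Editor's note — the two request.go:700 records ("Waited for 1.003577445s" earlier, "Waited for 1.926954733s" here) come from client-go's client-side token-bucket rate limiter, which the message explicitly distinguishes from server-side API Priority and Fairness. With this many reflectors LISTing secrets and configmaps at once during startup, the kubelet briefly exhausts its burst allowance and queues its own requests. In a generic client-go program the same knobs live on rest.Config; a sketch with illustrative values (the kubeconfig path and the QPS/Burst numbers are our assumptions, not a recommendation for kubelet tuning):

```go
// throttle.go - where "Waited for ... due to client-side throttling" comes
// from: the token-bucket limiter attached to every client-go rest.Config.
package main

import (
	"fmt"

	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	// Defaults are QPS=5, Burst=10. Once Burst tokens are spent, each further
	// request blocks (and client-go logs the "Waited for ..." line) until the
	// bucket refills at QPS tokens per second.
	cfg.QPS = 50
	cfg.Burst = 100
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	fmt.Printf("client ready: %T (QPS=%v Burst=%v)\n", client, cfg.QPS, cfg.Burst)
}
```

The waits here are informational, not failures: the requests go through one to two seconds late, which is why the dockercfg cache lines that depend on them land slightly after their neighbours.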
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.280876 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-dkph9\" (UID: \"efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkph9" Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.294221 4778 projected.go:288] Couldn't get configMap openshift-console-operator/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.294278 4778 projected.go:194] Error preparing data for projected volume kube-api-access-5nbbq for pod openshift-console-operator/console-operator-58897d9998-hm822: failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.294402 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5f326088-a1d0-43ee-9b2a-41e7ac797679-kube-api-access-5nbbq podName:5f326088-a1d0-43ee-9b2a-41e7ac797679 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:03.794361199 +0000 UTC m=+142.784259002 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5nbbq" (UniqueName: "kubernetes.io/projected/5f326088-a1d0-43ee-9b2a-41e7ac797679-kube-api-access-5nbbq") pod "console-operator-58897d9998-hm822" (UID: "5f326088-a1d0-43ee-9b2a-41e7ac797679") : failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.296122 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.305159 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcrr5\" (UniqueName: \"kubernetes.io/projected/e848ffb5-3244-4b27-a090-c63685145174-kube-api-access-dcrr5\") pod \"apiserver-76f77b778f-b72ms\" (UID: \"e848ffb5-3244-4b27-a090-c63685145174\") " pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.315211 4778 projected.go:288] Couldn't get configMap openshift-console/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.318606 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-7njlp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.321715 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.327800 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkph9" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.333008 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-client-ca\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.343264 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.357353 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.382240 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.400610 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0de525d2-5fd2-4fd3-9524-3a5505955417-images\") pod \"machine-api-operator-5694c8668f-whtc5\" (UID: \"0de525d2-5fd2-4fd3-9524-3a5505955417\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401027 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401065 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/452a4879-b0bd-490e-bffb-25f8404a6eac-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401095 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89d4ed97-8e17-4eea-9774-19034031f4dc-config\") pod \"kube-controller-manager-operator-78b949d7b-6vxnc\" (UID: \"89d4ed97-8e17-4eea-9774-19034031f4dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vxnc" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401117 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf8wz\" (UniqueName: \"kubernetes.io/projected/22eb8492-05be-4e05-a4b8-34f965c014ed-kube-api-access-nf8wz\") pod \"downloads-7954f5f757-xr85z\" (UID: \"22eb8492-05be-4e05-a4b8-34f965c014ed\") " pod="openshift-console/downloads-7954f5f757-xr85z" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401135 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401152 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/452a4879-b0bd-490e-bffb-25f8404a6eac-registry-certificates\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401169 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b2ed2e0-659c-4371-bb93-039914cfa0b8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-95zm8\" (UID: \"7b2ed2e0-659c-4371-bb93-039914cfa0b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-95zm8" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401197 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401251 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q7dx\" (UniqueName: \"kubernetes.io/projected/0de525d2-5fd2-4fd3-9524-3a5505955417-kube-api-access-6q7dx\") pod \"machine-api-operator-5694c8668f-whtc5\" (UID: \"0de525d2-5fd2-4fd3-9524-3a5505955417\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401287 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/452a4879-b0bd-490e-bffb-25f8404a6eac-trusted-ca\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401303 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401325 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e6175301-9b87-47be-8c95-a4ce7fa0a413-audit-dir\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401350 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401385 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b2ed2e0-659c-4371-bb93-039914cfa0b8-config\") pod \"kube-apiserver-operator-766d6c64bb-95zm8\" (UID: \"7b2ed2e0-659c-4371-bb93-039914cfa0b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-95zm8" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401452 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0de525d2-5fd2-4fd3-9524-3a5505955417-config\") pod \"machine-api-operator-5694c8668f-whtc5\" (UID: \"0de525d2-5fd2-4fd3-9524-3a5505955417\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401478 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401495 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e054b3bf-2796-4649-a9a0-8d19ea90412b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ff94r\" (UID: \"e054b3bf-2796-4649-a9a0-8d19ea90412b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401523 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e054b3bf-2796-4649-a9a0-8d19ea90412b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ff94r\" (UID: \"e054b3bf-2796-4649-a9a0-8d19ea90412b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401538 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-audit-policies\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401557 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401572 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89d4ed97-8e17-4eea-9774-19034031f4dc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6vxnc\" (UID: \"89d4ed97-8e17-4eea-9774-19034031f4dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vxnc" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401612 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fggq2\" (UniqueName: \"kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-kube-api-access-fggq2\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401673 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401691 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e054b3bf-2796-4649-a9a0-8d19ea90412b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ff94r\" (UID: \"e054b3bf-2796-4649-a9a0-8d19ea90412b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401720 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b2ed2e0-659c-4371-bb93-039914cfa0b8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-95zm8\" (UID: \"7b2ed2e0-659c-4371-bb93-039914cfa0b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-95zm8" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401738 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401758 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfg8p\" (UniqueName: \"kubernetes.io/projected/e6175301-9b87-47be-8c95-a4ce7fa0a413-kube-api-access-bfg8p\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.401982 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs866\" (UniqueName: \"kubernetes.io/projected/e054b3bf-2796-4649-a9a0-8d19ea90412b-kube-api-access-fs866\") pod \"cluster-image-registry-operator-dc59b4c8b-ff94r\" (UID: \"e054b3bf-2796-4649-a9a0-8d19ea90412b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r" Sep 30 17:20:03 crc 
kubenswrapper[4778]: I0930 17:20:03.402112 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89d4ed97-8e17-4eea-9774-19034031f4dc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6vxnc\" (UID: \"89d4ed97-8e17-4eea-9774-19034031f4dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vxnc" Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.402159 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:03.902146188 +0000 UTC m=+142.892043991 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.402195 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/452a4879-b0bd-490e-bffb-25f8404a6eac-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.402214 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.403528 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.403578 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0de525d2-5fd2-4fd3-9524-3a5505955417-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-whtc5\" (UID: \"0de525d2-5fd2-4fd3-9524-3a5505955417\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.403857 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-registry-tls\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " 
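
The MountDevice failure above means the node had not yet registered the kubevirt.io.hostpath-provisioner CSI driver when the image-registry PVC mount was attempted; node-level registrations surface in the node's CSINode object. A sketch of inspecting that list with client-go; the node name "crc" and kubeconfig path are assumptions:

```go
// Sketch for listing a node's registered CSI drivers; the MountDevice error
// above means kubevirt.io.hostpath-provisioner was missing from this list
// when the mount was tried.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // hypothetical path
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	csiNode, err := client.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, d := range csiNode.Spec.Drivers {
		// The hostpath provisioner's node plugin must appear here before
		// MountDevice for its PVCs can succeed.
		fmt.Println("registered CSI driver:", d.Name)
	}
}
```
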
pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.403896 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-bound-sa-token\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.403928 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.403942 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.406189 4778 projected.go:194] Error preparing data for projected volume kube-api-access-96v5t for pod openshift-controller-manager/controller-manager-879f6c89f-wxgcd: failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.406269 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-kube-api-access-96v5t podName:c1da094b-f44d-4d9d-9fe8-3d19ea244d09 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:03.906245747 +0000 UTC m=+142.896143550 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-96v5t" (UniqueName: "kubernetes.io/projected/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-kube-api-access-96v5t") pod "controller-manager-879f6c89f-wxgcd" (UID: "c1da094b-f44d-4d9d-9fe8-3d19ea244d09") : failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.421510 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.423897 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-config\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.443260 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.465677 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.482852 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.485824 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f326088-a1d0-43ee-9b2a-41e7ac797679-serving-cert\") pod \"console-operator-58897d9998-hm822\" (UID: \"5f326088-a1d0-43ee-9b2a-41e7ac797679\") " pod="openshift-console-operator/console-operator-58897d9998-hm822" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.492537 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-config\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.501807 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.505525 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.505803 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89d4ed97-8e17-4eea-9774-19034031f4dc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6vxnc\" (UID: \"89d4ed97-8e17-4eea-9774-19034031f4dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vxnc" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.505856 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c-serving-cert\") pod \"service-ca-operator-777779d784-w8zpd\" (UID: 
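
`No retries permitted until ... (durationBeforeRetry 500ms)` is the volume manager's per-operation exponential backoff: each failed mount attempt roughly doubles the delay before the next one is allowed. A self-contained sketch of the same backoff primitive; the 500ms seed mirrors the log, while the failing condition is fabricated:

```go
// Sketch of the exponential backoff behind "No retries permitted until ...
// (durationBeforeRetry 500ms)". The condition below is invented.
package main

import (
	"fmt"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

func main() {
	backoff := wait.Backoff{
		Duration: 500 * time.Millisecond, // initial durationBeforeRetry in the log
		Factor:   2.0,                    // 500ms, 1s, 2s, ...
		Steps:    5,
	}
	attempt := 0
	err := wait.ExponentialBackoff(backoff, func() (bool, error) {
		attempt++
		fmt.Println("MountVolume attempt", attempt)
		return attempt >= 4, nil // pretend the cache warms up on the 4th try
	})
	if err != nil {
		fmt.Println("gave up:", err) // "timed out waiting for the condition"
	}
}
```
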
\"9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8zpd" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.505899 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/452a4879-b0bd-490e-bffb-25f8404a6eac-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.505923 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/299ca171-a2bb-4da3-9630-31d12f416ae9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p6wqv\" (UID: \"299ca171-a2bb-4da3-9630-31d12f416ae9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p6wqv" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.505955 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6b13f665-8a05-4466-a03b-511d2456f1ca-registration-dir\") pod \"csi-hostpathplugin-td2mw\" (UID: \"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.505978 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/51bf38ac-54b4-4f70-b3bf-14a2fef26dcb-srv-cert\") pod \"olm-operator-6b444d44fb-fn7v9\" (UID: \"51bf38ac-54b4-4f70-b3bf-14a2fef26dcb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506013 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506036 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0de525d2-5fd2-4fd3-9524-3a5505955417-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-whtc5\" (UID: \"0de525d2-5fd2-4fd3-9524-3a5505955417\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506116 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-bound-sa-token\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506140 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3-metrics-tls\") pod \"dns-default-558p9\" (UID: \"2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3\") " 
pod="openshift-dns/dns-default-558p9" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506162 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0de525d2-5fd2-4fd3-9524-3a5505955417-images\") pod \"machine-api-operator-5694c8668f-whtc5\" (UID: \"0de525d2-5fd2-4fd3-9524-3a5505955417\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506188 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnn4q\" (UniqueName: \"kubernetes.io/projected/ee173b5a-1b04-4447-ac5e-42d11cc5e80c-kube-api-access-xnn4q\") pod \"machine-config-controller-84d6567774-nz4ww\" (UID: \"ee173b5a-1b04-4447-ac5e-42d11cc5e80c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506253 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/452a4879-b0bd-490e-bffb-25f8404a6eac-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506276 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88gdp\" (UniqueName: \"kubernetes.io/projected/45e7824d-5ef7-4454-bdb7-8a02b52c6c45-kube-api-access-88gdp\") pod \"migrator-59844c95c7-nfjrg\" (UID: \"45e7824d-5ef7-4454-bdb7-8a02b52c6c45\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfjrg" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506300 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/385ca6a0-940d-409a-a0aa-b22ab8920177-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bjv8b\" (UID: \"385ca6a0-940d-409a-a0aa-b22ab8920177\") " pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506320 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6b13f665-8a05-4466-a03b-511d2456f1ca-plugins-dir\") pod \"csi-hostpathplugin-td2mw\" (UID: \"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506353 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbkm9\" (UniqueName: \"kubernetes.io/projected/cc7a228b-7187-412c-987d-696193bfce29-kube-api-access-qbkm9\") pod \"machine-config-operator-74547568cd-dvxvh\" (UID: \"cc7a228b-7187-412c-987d-696193bfce29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506397 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f87n\" (UniqueName: \"kubernetes.io/projected/ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f-kube-api-access-4f87n\") pod \"service-ca-9c57cc56f-p7f84\" (UID: \"ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f\") " pod="openshift-service-ca/service-ca-9c57cc56f-p7f84" Sep 30 17:20:03 crc 
kubenswrapper[4778]: I0930 17:20:03.506443 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf8wz\" (UniqueName: \"kubernetes.io/projected/22eb8492-05be-4e05-a4b8-34f965c014ed-kube-api-access-nf8wz\") pod \"downloads-7954f5f757-xr85z\" (UID: \"22eb8492-05be-4e05-a4b8-34f965c014ed\") " pod="openshift-console/downloads-7954f5f757-xr85z" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506521 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506548 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6rz9\" (UniqueName: \"kubernetes.io/projected/2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3-kube-api-access-x6rz9\") pod \"dns-default-558p9\" (UID: \"2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3\") " pod="openshift-dns/dns-default-558p9" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506574 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b2ed2e0-659c-4371-bb93-039914cfa0b8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-95zm8\" (UID: \"7b2ed2e0-659c-4371-bb93-039914cfa0b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-95zm8" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506655 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/452a4879-b0bd-490e-bffb-25f8404a6eac-registry-certificates\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506682 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506718 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q7dx\" (UniqueName: \"kubernetes.io/projected/0de525d2-5fd2-4fd3-9524-3a5505955417-kube-api-access-6q7dx\") pod \"machine-api-operator-5694c8668f-whtc5\" (UID: \"0de525d2-5fd2-4fd3-9524-3a5505955417\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506742 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1f2da339-d7c7-4caa-8eb2-aa2bcd178b50-node-bootstrap-token\") pod \"machine-config-server-dq4bf\" (UID: \"1f2da339-d7c7-4caa-8eb2-aa2bcd178b50\") " pod="openshift-machine-config-operator/machine-config-server-dq4bf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506766 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506791 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cc7a228b-7187-412c-987d-696193bfce29-images\") pod \"machine-config-operator-74547568cd-dvxvh\" (UID: \"cc7a228b-7187-412c-987d-696193bfce29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506815 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vmgz\" (UniqueName: \"kubernetes.io/projected/385ca6a0-940d-409a-a0aa-b22ab8920177-kube-api-access-6vmgz\") pod \"marketplace-operator-79b997595-bjv8b\" (UID: \"385ca6a0-940d-409a-a0aa-b22ab8920177\") " pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506839 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6b13f665-8a05-4466-a03b-511d2456f1ca-csi-data-dir\") pod \"csi-hostpathplugin-td2mw\" (UID: \"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506889 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e6175301-9b87-47be-8c95-a4ce7fa0a413-audit-dir\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506924 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26jql\" (UniqueName: \"kubernetes.io/projected/06cbc897-4734-4d93-8cfa-75dc0b31cc57-kube-api-access-26jql\") pod \"ingress-canary-rzzcv\" (UID: \"06cbc897-4734-4d93-8cfa-75dc0b31cc57\") " pod="openshift-ingress-canary/ingress-canary-rzzcv" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.506975 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/113264a3-0486-4bbf-bb1f-1c7494dbfeea-profile-collector-cert\") pod \"catalog-operator-68c6474976-sr9l8\" (UID: \"113264a3-0486-4bbf-bb1f-1c7494dbfeea\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507016 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwxch\" (UniqueName: \"kubernetes.io/projected/1f2da339-d7c7-4caa-8eb2-aa2bcd178b50-kube-api-access-lwxch\") pod \"machine-config-server-dq4bf\" (UID: \"1f2da339-d7c7-4caa-8eb2-aa2bcd178b50\") " pod="openshift-machine-config-operator/machine-config-server-dq4bf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507039 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/5b85a8ab-2235-4f6f-826f-22f354550dfb-trusted-ca\") pod \"ingress-operator-5b745b69d9-rfdt6\" (UID: \"5b85a8ab-2235-4f6f-826f-22f354550dfb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507065 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e054b3bf-2796-4649-a9a0-8d19ea90412b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ff94r\" (UID: \"e054b3bf-2796-4649-a9a0-8d19ea90412b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507099 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc7a228b-7187-412c-987d-696193bfce29-auth-proxy-config\") pod \"machine-config-operator-74547568cd-dvxvh\" (UID: \"cc7a228b-7187-412c-987d-696193bfce29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507124 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8d87\" (UniqueName: \"kubernetes.io/projected/f4ed657a-f56f-4237-a9bb-03aad72e437d-kube-api-access-w8d87\") pod \"package-server-manager-789f6589d5-klvfk\" (UID: \"f4ed657a-f56f-4237-a9bb-03aad72e437d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-klvfk" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507195 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fggq2\" (UniqueName: \"kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-kube-api-access-fggq2\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507220 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/299ca171-a2bb-4da3-9630-31d12f416ae9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p6wqv\" (UID: \"299ca171-a2bb-4da3-9630-31d12f416ae9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p6wqv" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507248 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b42nv\" (UniqueName: \"kubernetes.io/projected/299ca171-a2bb-4da3-9630-31d12f416ae9-kube-api-access-b42nv\") pod \"kube-storage-version-migrator-operator-b67b599dd-p6wqv\" (UID: \"299ca171-a2bb-4da3-9630-31d12f416ae9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p6wqv" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507288 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 
17:20:03.507314 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfg8p\" (UniqueName: \"kubernetes.io/projected/e6175301-9b87-47be-8c95-a4ce7fa0a413-kube-api-access-bfg8p\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507337 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs866\" (UniqueName: \"kubernetes.io/projected/e054b3bf-2796-4649-a9a0-8d19ea90412b-kube-api-access-fs866\") pod \"cluster-image-registry-operator-dc59b4c8b-ff94r\" (UID: \"e054b3bf-2796-4649-a9a0-8d19ea90412b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507363 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1f2da339-d7c7-4caa-8eb2-aa2bcd178b50-certs\") pod \"machine-config-server-dq4bf\" (UID: \"1f2da339-d7c7-4caa-8eb2-aa2bcd178b50\") " pod="openshift-machine-config-operator/machine-config-server-dq4bf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507387 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/113264a3-0486-4bbf-bb1f-1c7494dbfeea-srv-cert\") pod \"catalog-operator-68c6474976-sr9l8\" (UID: \"113264a3-0486-4bbf-bb1f-1c7494dbfeea\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507411 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce48deed-422c-411b-946d-30a87d293815-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-b8229\" (UID: \"ce48deed-422c-411b-946d-30a87d293815\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b8229" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507434 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b85a8ab-2235-4f6f-826f-22f354550dfb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rfdt6\" (UID: \"5b85a8ab-2235-4f6f-826f-22f354550dfb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507463 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507491 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cc7a228b-7187-412c-987d-696193bfce29-proxy-tls\") pod \"machine-config-operator-74547568cd-dvxvh\" (UID: \"cc7a228b-7187-412c-987d-696193bfce29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" Sep 30 17:20:03 crc 
kubenswrapper[4778]: I0930 17:20:03.507514 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lh6m\" (UniqueName: \"kubernetes.io/projected/6b13f665-8a05-4466-a03b-511d2456f1ca-kube-api-access-7lh6m\") pod \"csi-hostpathplugin-td2mw\" (UID: \"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507562 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-registry-tls\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507600 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507657 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bjjs\" (UniqueName: \"kubernetes.io/projected/9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c-kube-api-access-5bjjs\") pod \"service-ca-operator-777779d784-w8zpd\" (UID: \"9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8zpd" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507685 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b85a8ab-2235-4f6f-826f-22f354550dfb-metrics-tls\") pod \"ingress-operator-5b745b69d9-rfdt6\" (UID: \"5b85a8ab-2235-4f6f-826f-22f354550dfb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507713 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507757 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89d4ed97-8e17-4eea-9774-19034031f4dc-config\") pod \"kube-controller-manager-operator-78b949d7b-6vxnc\" (UID: \"89d4ed97-8e17-4eea-9774-19034031f4dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vxnc" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507784 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tkmt\" (UniqueName: \"kubernetes.io/projected/535566fd-2a2e-43a4-94cb-dea8d1e2123b-kube-api-access-7tkmt\") pod \"collect-profiles-29320875-66s7c\" (UID: \"535566fd-2a2e-43a4-94cb-dea8d1e2123b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507810 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f-signing-key\") pod \"service-ca-9c57cc56f-p7f84\" (UID: \"ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f\") " pod="openshift-service-ca/service-ca-9c57cc56f-p7f84" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507886 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw47d\" (UniqueName: \"kubernetes.io/projected/ce48deed-422c-411b-946d-30a87d293815-kube-api-access-gw47d\") pod \"control-plane-machine-set-operator-78cbb6b69f-b8229\" (UID: \"ce48deed-422c-411b-946d-30a87d293815\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b8229" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507927 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/535566fd-2a2e-43a4-94cb-dea8d1e2123b-secret-volume\") pod \"collect-profiles-29320875-66s7c\" (UID: \"535566fd-2a2e-43a4-94cb-dea8d1e2123b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507952 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/51bf38ac-54b4-4f70-b3bf-14a2fef26dcb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fn7v9\" (UID: \"51bf38ac-54b4-4f70-b3bf-14a2fef26dcb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.507980 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ee173b5a-1b04-4447-ac5e-42d11cc5e80c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nz4ww\" (UID: \"ee173b5a-1b04-4447-ac5e-42d11cc5e80c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508004 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee173b5a-1b04-4447-ac5e-42d11cc5e80c-proxy-tls\") pod \"machine-config-controller-84d6567774-nz4ww\" (UID: \"ee173b5a-1b04-4447-ac5e-42d11cc5e80c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508027 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c-config\") pod \"service-ca-operator-777779d784-w8zpd\" (UID: \"9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8zpd" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508054 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dxr7\" (UniqueName: \"kubernetes.io/projected/51bf38ac-54b4-4f70-b3bf-14a2fef26dcb-kube-api-access-6dxr7\") pod \"olm-operator-6b444d44fb-fn7v9\" (UID: \"51bf38ac-54b4-4f70-b3bf-14a2fef26dcb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 
17:20:03.508079 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4ed657a-f56f-4237-a9bb-03aad72e437d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-klvfk\" (UID: \"f4ed657a-f56f-4237-a9bb-03aad72e437d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-klvfk" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508103 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/535566fd-2a2e-43a4-94cb-dea8d1e2123b-config-volume\") pod \"collect-profiles-29320875-66s7c\" (UID: \"535566fd-2a2e-43a4-94cb-dea8d1e2123b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508126 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3-config-volume\") pod \"dns-default-558p9\" (UID: \"2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3\") " pod="openshift-dns/dns-default-558p9" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508198 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/385ca6a0-940d-409a-a0aa-b22ab8920177-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bjv8b\" (UID: \"385ca6a0-940d-409a-a0aa-b22ab8920177\") " pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508238 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a782c417-1524-4002-9d72-62747e465267-apiservice-cert\") pod \"packageserver-d55dfcdfc-26ztf\" (UID: \"a782c417-1524-4002-9d72-62747e465267\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508274 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/452a4879-b0bd-490e-bffb-25f8404a6eac-trusted-ca\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508302 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6b13f665-8a05-4466-a03b-511d2456f1ca-mountpoint-dir\") pod \"csi-hostpathplugin-td2mw\" (UID: \"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508328 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b2ed2e0-659c-4371-bb93-039914cfa0b8-config\") pod \"kube-apiserver-operator-766d6c64bb-95zm8\" (UID: \"7b2ed2e0-659c-4371-bb93-039914cfa0b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-95zm8" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508353 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-rg27x\" (UniqueName: \"kubernetes.io/projected/113264a3-0486-4bbf-bb1f-1c7494dbfeea-kube-api-access-rg27x\") pod \"catalog-operator-68c6474976-sr9l8\" (UID: \"113264a3-0486-4bbf-bb1f-1c7494dbfeea\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508416 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttd8c\" (UniqueName: \"kubernetes.io/projected/a782c417-1524-4002-9d72-62747e465267-kube-api-access-ttd8c\") pod \"packageserver-d55dfcdfc-26ztf\" (UID: \"a782c417-1524-4002-9d72-62747e465267\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508452 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0de525d2-5fd2-4fd3-9524-3a5505955417-config\") pod \"machine-api-operator-5694c8668f-whtc5\" (UID: \"0de525d2-5fd2-4fd3-9524-3a5505955417\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508476 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skdds\" (UniqueName: \"kubernetes.io/projected/5b85a8ab-2235-4f6f-826f-22f354550dfb-kube-api-access-skdds\") pod \"ingress-operator-5b745b69d9-rfdt6\" (UID: \"5b85a8ab-2235-4f6f-826f-22f354550dfb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508517 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508542 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a782c417-1524-4002-9d72-62747e465267-tmpfs\") pod \"packageserver-d55dfcdfc-26ztf\" (UID: \"a782c417-1524-4002-9d72-62747e465267\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508582 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e054b3bf-2796-4649-a9a0-8d19ea90412b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ff94r\" (UID: \"e054b3bf-2796-4649-a9a0-8d19ea90412b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508641 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-audit-policies\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508668 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508689 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06cbc897-4734-4d93-8cfa-75dc0b31cc57-cert\") pod \"ingress-canary-rzzcv\" (UID: \"06cbc897-4734-4d93-8cfa-75dc0b31cc57\") " pod="openshift-ingress-canary/ingress-canary-rzzcv" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508724 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89d4ed97-8e17-4eea-9774-19034031f4dc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6vxnc\" (UID: \"89d4ed97-8e17-4eea-9774-19034031f4dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vxnc" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508748 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6b13f665-8a05-4466-a03b-511d2456f1ca-socket-dir\") pod \"csi-hostpathplugin-td2mw\" (UID: \"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508770 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f-signing-cabundle\") pod \"service-ca-9c57cc56f-p7f84\" (UID: \"ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f\") " pod="openshift-service-ca/service-ca-9c57cc56f-p7f84" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508798 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508822 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a782c417-1524-4002-9d72-62747e465267-webhook-cert\") pod \"packageserver-d55dfcdfc-26ztf\" (UID: \"a782c417-1524-4002-9d72-62747e465267\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508852 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e054b3bf-2796-4649-a9a0-8d19ea90412b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ff94r\" (UID: \"e054b3bf-2796-4649-a9a0-8d19ea90412b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.508882 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b2ed2e0-659c-4371-bb93-039914cfa0b8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-95zm8\" (UID: 
\"7b2ed2e0-659c-4371-bb93-039914cfa0b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-95zm8" Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.509748 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.00972583 +0000 UTC m=+142.999623633 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.512159 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.514243 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0de525d2-5fd2-4fd3-9524-3a5505955417-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-whtc5\" (UID: \"0de525d2-5fd2-4fd3-9524-3a5505955417\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.515717 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0de525d2-5fd2-4fd3-9524-3a5505955417-images\") pod \"machine-api-operator-5694c8668f-whtc5\" (UID: \"0de525d2-5fd2-4fd3-9524-3a5505955417\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.516344 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e6175301-9b87-47be-8c95-a4ce7fa0a413-audit-dir\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.516967 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/452a4879-b0bd-490e-bffb-25f8404a6eac-ca-trust-extracted\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.519627 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.520168 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89d4ed97-8e17-4eea-9774-19034031f4dc-config\") pod \"kube-controller-manager-operator-78b949d7b-6vxnc\" (UID: \"89d4ed97-8e17-4eea-9774-19034031f4dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vxnc" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.520705 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-audit-policies\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.521255 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0de525d2-5fd2-4fd3-9524-3a5505955417-config\") pod \"machine-api-operator-5694c8668f-whtc5\" (UID: \"0de525d2-5fd2-4fd3-9524-3a5505955417\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.523967 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/452a4879-b0bd-490e-bffb-25f8404a6eac-trusted-ca\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.525003 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e054b3bf-2796-4649-a9a0-8d19ea90412b-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ff94r\" (UID: \"e054b3bf-2796-4649-a9a0-8d19ea90412b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.526130 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.526997 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.527281 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b2ed2e0-659c-4371-bb93-039914cfa0b8-config\") pod \"kube-apiserver-operator-766d6c64bb-95zm8\" (UID: \"7b2ed2e0-659c-4371-bb93-039914cfa0b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-95zm8" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.528034 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/452a4879-b0bd-490e-bffb-25f8404a6eac-registry-certificates\") pod 
\"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.528081 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.528400 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.531028 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.532885 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.534325 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b2ed2e0-659c-4371-bb93-039914cfa0b8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-95zm8\" (UID: \"7b2ed2e0-659c-4371-bb93-039914cfa0b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-95zm8" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.535125 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-serving-cert\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.546826 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.550182 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.550535 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 
17:20:03.550766 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89d4ed97-8e17-4eea-9774-19034031f4dc-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6vxnc\" (UID: \"89d4ed97-8e17-4eea-9774-19034031f4dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vxnc" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.551033 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/452a4879-b0bd-490e-bffb-25f8404a6eac-installation-pull-secrets\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.551911 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.552463 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-registry-tls\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.556155 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/e054b3bf-2796-4649-a9a0-8d19ea90412b-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ff94r\" (UID: \"e054b3bf-2796-4649-a9a0-8d19ea90412b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.557335 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-serving-cert\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.557678 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.559461 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" event={"ID":"637f30f0-8906-4b9b-bfa8-b356ee2f88d9","Type":"ContainerStarted","Data":"1e67e542db52deba749e9acbb733ab4e6200ceaa0286da938afeaf51646f196f"} Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.560164 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" Sep 30 17:20:03 crc 
kubenswrapper[4778]: I0930 17:20:03.561004 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.563387 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zwnt5" event={"ID":"971714d1-d7ca-458a-98a3-7f0172c2e3c1","Type":"ContainerStarted","Data":"7f974f209744d9ccca9b82e3ac47a8dd87ef241e44c19da6be75436c5a9b8bb8"} Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.563436 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-zwnt5" event={"ID":"971714d1-d7ca-458a-98a3-7f0172c2e3c1","Type":"ContainerStarted","Data":"f444d6af38257effd1ee54cf2dc0020bccfc0feff8b3c93189581e11a94483ce"} Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.572365 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89" event={"ID":"a6003c01-a0dc-4474-88f8-cf178d542b63","Type":"ContainerStarted","Data":"6774918cf6384af45cddba0042ee22952730adc60760892290b7b4a6dd40a7d2"} Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.572670 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-oauth-config\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.574884 4778 generic.go:334] "Generic (PLEG): container finished" podID="1075512e-3cc0-40fe-938c-07f5baaf964e" containerID="b6fba158f7929174eec3116be9d67ca285ba8faf1e163fc6579affb77f6f1d1c" exitCode=0 Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.574963 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k755p" event={"ID":"1075512e-3cc0-40fe-938c-07f5baaf964e","Type":"ContainerDied","Data":"b6fba158f7929174eec3116be9d67ca285ba8faf1e163fc6579affb77f6f1d1c"} Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.575009 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k755p" event={"ID":"1075512e-3cc0-40fe-938c-07f5baaf964e","Type":"ContainerStarted","Data":"05d8a14c2d61077bef29bf45426c9b376fb254c4c99c9d471c2d04b81b7580d8"} Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.582124 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.583668 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v4spf" event={"ID":"7bc95114-073a-45f7-bf10-c008e08c6e52","Type":"ContainerStarted","Data":"a15e18d49d5a785c60c2494e1891e8a99b9bb2a3cdb729a5ddbee2c58a32c962"} Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.583785 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v4spf" event={"ID":"7bc95114-073a-45f7-bf10-c008e08c6e52","Type":"ContainerStarted","Data":"d7f0e1c0dfa3d1282b1d12283fa56f43629170f1ba9080f4f1deff869484f88e"} Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.583940 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v4spf" event={"ID":"7bc95114-073a-45f7-bf10-c008e08c6e52","Type":"ContainerStarted","Data":"68831f61e3908d99c49c66e8bef37b2ea8378e1f11a31f5de73a1da5d822c80d"} Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.585339 4778 projected.go:194] Error preparing data for projected volume kube-api-access-jgczg for pod openshift-console/console-f9d7485db-pl4fp: failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.585456 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-kube-api-access-jgczg podName:82aaff94-0c3c-4a1b-be0c-5371c3b60ab0 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.085438105 +0000 UTC m=+143.075335908 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jgczg" (UniqueName: "kubernetes.io/projected/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-kube-api-access-jgczg") pod "console-f9d7485db-pl4fp" (UID: "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0") : failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.586546 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs"] Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.600407 4778 configmap.go:193] Couldn't get configMap openshift-console-operator/console-operator-config: failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.600490 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f326088-a1d0-43ee-9b2a-41e7ac797679-config podName:5f326088-a1d0-43ee-9b2a-41e7ac797679 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.600468344 +0000 UTC m=+143.590366147 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/5f326088-a1d0-43ee-9b2a-41e7ac797679-config") pod "console-operator-58897d9998-hm822" (UID: "5f326088-a1d0-43ee-9b2a-41e7ac797679") : failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.600658 4778 configmap.go:193] Couldn't get configMap openshift-console/oauth-serving-cert: failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.600693 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-oauth-serving-cert podName:82aaff94-0c3c-4a1b-be0c-5371c3b60ab0 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.600684062 +0000 UTC m=+143.590581865 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "oauth-serving-cert" (UniqueName: "kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-oauth-serving-cert") pod "console-f9d7485db-pl4fp" (UID: "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0") : failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.600725 4778 configmap.go:193] Couldn't get configMap openshift-console/service-ca: failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.600747 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-service-ca podName:82aaff94-0c3c-4a1b-be0c-5371c3b60ab0 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.600739254 +0000 UTC m=+143.590637057 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca" (UniqueName: "kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-service-ca") pod "console-f9d7485db-pl4fp" (UID: "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0") : failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.600752 4778 configmap.go:193] Couldn't get configMap openshift-console/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.600837 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-trusted-ca-bundle podName:82aaff94-0c3c-4a1b-be0c-5371c3b60ab0 nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.600816977 +0000 UTC m=+143.590714770 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-trusted-ca-bundle") pod "console-f9d7485db-pl4fp" (UID: "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0") : failed to sync configmap cache: timed out waiting for the condition Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.603140 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.603746 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-b72ms" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.605307 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" event={"ID":"4f2a6b6e-ec6e-491b-b997-10f2435e42a4","Type":"ContainerStarted","Data":"b2dd627cfdabcf4cd01ea6472ee1d12c50a5cc425123f9954938c4a0b8b95fad"} Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.605337 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" event={"ID":"4f2a6b6e-ec6e-491b-b997-10f2435e42a4","Type":"ContainerStarted","Data":"5be9567d4f8206cace6f1a3a31f3726eae8d05c04a2d715d61b13fda1e921ab9"} Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.610184 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1f2da339-d7c7-4caa-8eb2-aa2bcd178b50-certs\") pod \"machine-config-server-dq4bf\" (UID: \"1f2da339-d7c7-4caa-8eb2-aa2bcd178b50\") " pod="openshift-machine-config-operator/machine-config-server-dq4bf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.610240 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce48deed-422c-411b-946d-30a87d293815-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-b8229\" (UID: \"ce48deed-422c-411b-946d-30a87d293815\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b8229" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.610264 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b85a8ab-2235-4f6f-826f-22f354550dfb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rfdt6\" (UID: \"5b85a8ab-2235-4f6f-826f-22f354550dfb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.610283 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/113264a3-0486-4bbf-bb1f-1c7494dbfeea-srv-cert\") pod \"catalog-operator-68c6474976-sr9l8\" (UID: \"113264a3-0486-4bbf-bb1f-1c7494dbfeea\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.610298 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cc7a228b-7187-412c-987d-696193bfce29-proxy-tls\") pod \"machine-config-operator-74547568cd-dvxvh\" (UID: \"cc7a228b-7187-412c-987d-696193bfce29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.610315 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lh6m\" (UniqueName: \"kubernetes.io/projected/6b13f665-8a05-4466-a03b-511d2456f1ca-kube-api-access-7lh6m\") pod \"csi-hostpathplugin-td2mw\" (UID: \"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.610381 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bjjs\" (UniqueName: \"kubernetes.io/projected/9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c-kube-api-access-5bjjs\") pod 
\"service-ca-operator-777779d784-w8zpd\" (UID: \"9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8zpd" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.611795 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b85a8ab-2235-4f6f-826f-22f354550dfb-metrics-tls\") pod \"ingress-operator-5b745b69d9-rfdt6\" (UID: \"5b85a8ab-2235-4f6f-826f-22f354550dfb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.611893 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tkmt\" (UniqueName: \"kubernetes.io/projected/535566fd-2a2e-43a4-94cb-dea8d1e2123b-kube-api-access-7tkmt\") pod \"collect-profiles-29320875-66s7c\" (UID: \"535566fd-2a2e-43a4-94cb-dea8d1e2123b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.612051 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f-signing-key\") pod \"service-ca-9c57cc56f-p7f84\" (UID: \"ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f\") " pod="openshift-service-ca/service-ca-9c57cc56f-p7f84" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.612125 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw47d\" (UniqueName: \"kubernetes.io/projected/ce48deed-422c-411b-946d-30a87d293815-kube-api-access-gw47d\") pod \"control-plane-machine-set-operator-78cbb6b69f-b8229\" (UID: \"ce48deed-422c-411b-946d-30a87d293815\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b8229" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.612152 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/535566fd-2a2e-43a4-94cb-dea8d1e2123b-secret-volume\") pod \"collect-profiles-29320875-66s7c\" (UID: \"535566fd-2a2e-43a4-94cb-dea8d1e2123b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.618563 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/51bf38ac-54b4-4f70-b3bf-14a2fef26dcb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fn7v9\" (UID: \"51bf38ac-54b4-4f70-b3bf-14a2fef26dcb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.618636 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ee173b5a-1b04-4447-ac5e-42d11cc5e80c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nz4ww\" (UID: \"ee173b5a-1b04-4447-ac5e-42d11cc5e80c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.618670 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c-config\") pod \"service-ca-operator-777779d784-w8zpd\" (UID: \"9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8zpd" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.618691 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dxr7\" (UniqueName: \"kubernetes.io/projected/51bf38ac-54b4-4f70-b3bf-14a2fef26dcb-kube-api-access-6dxr7\") pod \"olm-operator-6b444d44fb-fn7v9\" (UID: \"51bf38ac-54b4-4f70-b3bf-14a2fef26dcb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.618712 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4ed657a-f56f-4237-a9bb-03aad72e437d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-klvfk\" (UID: \"f4ed657a-f56f-4237-a9bb-03aad72e437d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-klvfk" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.618732 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee173b5a-1b04-4447-ac5e-42d11cc5e80c-proxy-tls\") pod \"machine-config-controller-84d6567774-nz4ww\" (UID: \"ee173b5a-1b04-4447-ac5e-42d11cc5e80c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.618752 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3-config-volume\") pod \"dns-default-558p9\" (UID: \"2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3\") " pod="openshift-dns/dns-default-558p9" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.618785 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/535566fd-2a2e-43a4-94cb-dea8d1e2123b-config-volume\") pod \"collect-profiles-29320875-66s7c\" (UID: \"535566fd-2a2e-43a4-94cb-dea8d1e2123b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.618833 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/385ca6a0-940d-409a-a0aa-b22ab8920177-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bjv8b\" (UID: \"385ca6a0-940d-409a-a0aa-b22ab8920177\") " pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.618855 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a782c417-1524-4002-9d72-62747e465267-apiservice-cert\") pod \"packageserver-d55dfcdfc-26ztf\" (UID: \"a782c417-1524-4002-9d72-62747e465267\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.618913 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6b13f665-8a05-4466-a03b-511d2456f1ca-mountpoint-dir\") pod \"csi-hostpathplugin-td2mw\" (UID: \"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.618942 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rg27x\" (UniqueName: \"kubernetes.io/projected/113264a3-0486-4bbf-bb1f-1c7494dbfeea-kube-api-access-rg27x\") pod \"catalog-operator-68c6474976-sr9l8\" (UID: \"113264a3-0486-4bbf-bb1f-1c7494dbfeea\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.618974 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttd8c\" (UniqueName: \"kubernetes.io/projected/a782c417-1524-4002-9d72-62747e465267-kube-api-access-ttd8c\") pod \"packageserver-d55dfcdfc-26ztf\" (UID: \"a782c417-1524-4002-9d72-62747e465267\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.618999 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skdds\" (UniqueName: \"kubernetes.io/projected/5b85a8ab-2235-4f6f-826f-22f354550dfb-kube-api-access-skdds\") pod \"ingress-operator-5b745b69d9-rfdt6\" (UID: \"5b85a8ab-2235-4f6f-826f-22f354550dfb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619041 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a782c417-1524-4002-9d72-62747e465267-tmpfs\") pod \"packageserver-d55dfcdfc-26ztf\" (UID: \"a782c417-1524-4002-9d72-62747e465267\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619080 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06cbc897-4734-4d93-8cfa-75dc0b31cc57-cert\") pod \"ingress-canary-rzzcv\" (UID: \"06cbc897-4734-4d93-8cfa-75dc0b31cc57\") " pod="openshift-ingress-canary/ingress-canary-rzzcv" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619116 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6b13f665-8a05-4466-a03b-511d2456f1ca-socket-dir\") pod \"csi-hostpathplugin-td2mw\" (UID: \"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619132 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f-signing-cabundle\") pod \"service-ca-9c57cc56f-p7f84\" (UID: \"ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f\") " pod="openshift-service-ca/service-ca-9c57cc56f-p7f84" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619153 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a782c417-1524-4002-9d72-62747e465267-webhook-cert\") pod \"packageserver-d55dfcdfc-26ztf\" (UID: \"a782c417-1524-4002-9d72-62747e465267\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619251 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c-serving-cert\") pod \"service-ca-operator-777779d784-w8zpd\" (UID: \"9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8zpd" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619279 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/299ca171-a2bb-4da3-9630-31d12f416ae9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p6wqv\" (UID: \"299ca171-a2bb-4da3-9630-31d12f416ae9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p6wqv" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619304 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/51bf38ac-54b4-4f70-b3bf-14a2fef26dcb-srv-cert\") pod \"olm-operator-6b444d44fb-fn7v9\" (UID: \"51bf38ac-54b4-4f70-b3bf-14a2fef26dcb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619330 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6b13f665-8a05-4466-a03b-511d2456f1ca-registration-dir\") pod \"csi-hostpathplugin-td2mw\" (UID: \"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619363 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3-metrics-tls\") pod \"dns-default-558p9\" (UID: \"2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3\") " pod="openshift-dns/dns-default-558p9" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619390 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnn4q\" (UniqueName: \"kubernetes.io/projected/ee173b5a-1b04-4447-ac5e-42d11cc5e80c-kube-api-access-xnn4q\") pod \"machine-config-controller-84d6567774-nz4ww\" (UID: \"ee173b5a-1b04-4447-ac5e-42d11cc5e80c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619421 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88gdp\" (UniqueName: \"kubernetes.io/projected/45e7824d-5ef7-4454-bdb7-8a02b52c6c45-kube-api-access-88gdp\") pod \"migrator-59844c95c7-nfjrg\" (UID: \"45e7824d-5ef7-4454-bdb7-8a02b52c6c45\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfjrg" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619445 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/385ca6a0-940d-409a-a0aa-b22ab8920177-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bjv8b\" (UID: \"385ca6a0-940d-409a-a0aa-b22ab8920177\") " pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619468 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6b13f665-8a05-4466-a03b-511d2456f1ca-plugins-dir\") pod \"csi-hostpathplugin-td2mw\" (UID: \"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619496 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qbkm9\" (UniqueName: \"kubernetes.io/projected/cc7a228b-7187-412c-987d-696193bfce29-kube-api-access-qbkm9\") pod \"machine-config-operator-74547568cd-dvxvh\" (UID: \"cc7a228b-7187-412c-987d-696193bfce29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619519 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f87n\" (UniqueName: \"kubernetes.io/projected/ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f-kube-api-access-4f87n\") pod \"service-ca-9c57cc56f-p7f84\" (UID: \"ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f\") " pod="openshift-service-ca/service-ca-9c57cc56f-p7f84" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619562 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6rz9\" (UniqueName: \"kubernetes.io/projected/2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3-kube-api-access-x6rz9\") pod \"dns-default-558p9\" (UID: \"2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3\") " pod="openshift-dns/dns-default-558p9" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.619608 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1f2da339-d7c7-4caa-8eb2-aa2bcd178b50-node-bootstrap-token\") pod \"machine-config-server-dq4bf\" (UID: \"1f2da339-d7c7-4caa-8eb2-aa2bcd178b50\") " pod="openshift-machine-config-operator/machine-config-server-dq4bf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.625443 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cc7a228b-7187-412c-987d-696193bfce29-images\") pod \"machine-config-operator-74547568cd-dvxvh\" (UID: \"cc7a228b-7187-412c-987d-696193bfce29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.625476 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6b13f665-8a05-4466-a03b-511d2456f1ca-csi-data-dir\") pod \"csi-hostpathplugin-td2mw\" (UID: \"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.625527 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vmgz\" (UniqueName: \"kubernetes.io/projected/385ca6a0-940d-409a-a0aa-b22ab8920177-kube-api-access-6vmgz\") pod \"marketplace-operator-79b997595-bjv8b\" (UID: \"385ca6a0-940d-409a-a0aa-b22ab8920177\") " pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.625558 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.625593 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26jql\" (UniqueName: \"kubernetes.io/projected/06cbc897-4734-4d93-8cfa-75dc0b31cc57-kube-api-access-26jql\") pod \"ingress-canary-rzzcv\" (UID: 
\"06cbc897-4734-4d93-8cfa-75dc0b31cc57\") " pod="openshift-ingress-canary/ingress-canary-rzzcv" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.625658 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/113264a3-0486-4bbf-bb1f-1c7494dbfeea-profile-collector-cert\") pod \"catalog-operator-68c6474976-sr9l8\" (UID: \"113264a3-0486-4bbf-bb1f-1c7494dbfeea\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.625687 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwxch\" (UniqueName: \"kubernetes.io/projected/1f2da339-d7c7-4caa-8eb2-aa2bcd178b50-kube-api-access-lwxch\") pod \"machine-config-server-dq4bf\" (UID: \"1f2da339-d7c7-4caa-8eb2-aa2bcd178b50\") " pod="openshift-machine-config-operator/machine-config-server-dq4bf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.625709 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5b85a8ab-2235-4f6f-826f-22f354550dfb-trusted-ca\") pod \"ingress-operator-5b745b69d9-rfdt6\" (UID: \"5b85a8ab-2235-4f6f-826f-22f354550dfb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.625748 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc7a228b-7187-412c-987d-696193bfce29-auth-proxy-config\") pod \"machine-config-operator-74547568cd-dvxvh\" (UID: \"cc7a228b-7187-412c-987d-696193bfce29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.625771 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8d87\" (UniqueName: \"kubernetes.io/projected/f4ed657a-f56f-4237-a9bb-03aad72e437d-kube-api-access-w8d87\") pod \"package-server-manager-789f6589d5-klvfk\" (UID: \"f4ed657a-f56f-4237-a9bb-03aad72e437d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-klvfk" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.625841 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/299ca171-a2bb-4da3-9630-31d12f416ae9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p6wqv\" (UID: \"299ca171-a2bb-4da3-9630-31d12f416ae9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p6wqv" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.625861 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b42nv\" (UniqueName: \"kubernetes.io/projected/299ca171-a2bb-4da3-9630-31d12f416ae9-kube-api-access-b42nv\") pod \"kube-storage-version-migrator-operator-b67b599dd-p6wqv\" (UID: \"299ca171-a2bb-4da3-9630-31d12f416ae9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p6wqv" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.635073 4778 generic.go:334] "Generic (PLEG): container finished" podID="38eeb4e2-2d0a-43c3-b305-aa464ec83096" containerID="6c1e0eb4fc1a8c04c732c183223de68d423d49f8c25dac32d8136d6d0972dce3" exitCode=0 Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.635285 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" event={"ID":"38eeb4e2-2d0a-43c3-b305-aa464ec83096","Type":"ContainerDied","Data":"6c1e0eb4fc1a8c04c732c183223de68d423d49f8c25dac32d8136d6d0972dce3"} Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.635343 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" event={"ID":"38eeb4e2-2d0a-43c3-b305-aa464ec83096","Type":"ContainerStarted","Data":"28adcfc15de479c2beeb59409e588c1679fb721b6cdf78d5623406f3eacc4673"} Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.636457 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/1f2da339-d7c7-4caa-8eb2-aa2bcd178b50-certs\") pod \"machine-config-server-dq4bf\" (UID: \"1f2da339-d7c7-4caa-8eb2-aa2bcd178b50\") " pod="openshift-machine-config-operator/machine-config-server-dq4bf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.637736 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/cc7a228b-7187-412c-987d-696193bfce29-proxy-tls\") pod \"machine-config-operator-74547568cd-dvxvh\" (UID: \"cc7a228b-7187-412c-987d-696193bfce29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.638152 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/6b13f665-8a05-4466-a03b-511d2456f1ca-csi-data-dir\") pod \"csi-hostpathplugin-td2mw\" (UID: \"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.638358 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6b13f665-8a05-4466-a03b-511d2456f1ca-registration-dir\") pod \"csi-hostpathplugin-td2mw\" (UID: \"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.640184 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/6b13f665-8a05-4466-a03b-511d2456f1ca-plugins-dir\") pod \"csi-hostpathplugin-td2mw\" (UID: \"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.640472 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ee173b5a-1b04-4447-ac5e-42d11cc5e80c-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nz4ww\" (UID: \"ee173b5a-1b04-4447-ac5e-42d11cc5e80c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.640831 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c-config\") pod \"service-ca-operator-777779d784-w8zpd\" (UID: \"9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8zpd" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.642272 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/5b85a8ab-2235-4f6f-826f-22f354550dfb-trusted-ca\") pod \"ingress-operator-5b745b69d9-rfdt6\" (UID: \"5b85a8ab-2235-4f6f-826f-22f354550dfb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.642496 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-7njlp"] Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.643655 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/385ca6a0-940d-409a-a0aa-b22ab8920177-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-bjv8b\" (UID: \"385ca6a0-940d-409a-a0aa-b22ab8920177\") " pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.644321 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.644640 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.645206 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/cc7a228b-7187-412c-987d-696193bfce29-auth-proxy-config\") pod \"machine-config-operator-74547568cd-dvxvh\" (UID: \"cc7a228b-7187-412c-987d-696193bfce29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.647944 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vp8fx" event={"ID":"96905ba5-6042-4555-aad7-0ac5abb5e6e2","Type":"ContainerStarted","Data":"a2f48851cd72883383f4ebca32ec684f629e03fec426ba4d9abe60c3392ea1f9"} Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.669308 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vp8fx" event={"ID":"96905ba5-6042-4555-aad7-0ac5abb5e6e2","Type":"ContainerStarted","Data":"2b8f18a3706165e2eee8fd40178f054ec72632b580bfe09069ebc35521836b58"} Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.651328 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ce48deed-422c-411b-946d-30a87d293815-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-b8229\" (UID: \"ce48deed-422c-411b-946d-30a87d293815\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b8229" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.651747 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5b85a8ab-2235-4f6f-826f-22f354550dfb-metrics-tls\") pod \"ingress-operator-5b745b69d9-rfdt6\" (UID: \"5b85a8ab-2235-4f6f-826f-22f354550dfb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.654368 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/6b13f665-8a05-4466-a03b-511d2456f1ca-mountpoint-dir\") pod \"csi-hostpathplugin-td2mw\" (UID: 
\"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.655408 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6b13f665-8a05-4466-a03b-511d2456f1ca-socket-dir\") pod \"csi-hostpathplugin-td2mw\" (UID: \"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.656625 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/cc7a228b-7187-412c-987d-696193bfce29-images\") pod \"machine-config-operator-74547568cd-dvxvh\" (UID: \"cc7a228b-7187-412c-987d-696193bfce29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.657220 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f-signing-cabundle\") pod \"service-ca-9c57cc56f-p7f84\" (UID: \"ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f\") " pod="openshift-service-ca/service-ca-9c57cc56f-p7f84" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.657738 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3-config-volume\") pod \"dns-default-558p9\" (UID: \"2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3\") " pod="openshift-dns/dns-default-558p9" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.658608 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/299ca171-a2bb-4da3-9630-31d12f416ae9-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-p6wqv\" (UID: \"299ca171-a2bb-4da3-9630-31d12f416ae9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p6wqv" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.660032 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/535566fd-2a2e-43a4-94cb-dea8d1e2123b-config-volume\") pod \"collect-profiles-29320875-66s7c\" (UID: \"535566fd-2a2e-43a4-94cb-dea8d1e2123b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.665028 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3-metrics-tls\") pod \"dns-default-558p9\" (UID: \"2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3\") " pod="openshift-dns/dns-default-558p9" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.665874 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ee173b5a-1b04-4447-ac5e-42d11cc5e80c-proxy-tls\") pod \"machine-config-controller-84d6567774-nz4ww\" (UID: \"ee173b5a-1b04-4447-ac5e-42d11cc5e80c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww" Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.653050 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-09-30 17:20:04.153014326 +0000 UTC m=+143.142912129 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.650654 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/113264a3-0486-4bbf-bb1f-1c7494dbfeea-profile-collector-cert\") pod \"catalog-operator-68c6474976-sr9l8\" (UID: \"113264a3-0486-4bbf-bb1f-1c7494dbfeea\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.670654 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/a782c417-1524-4002-9d72-62747e465267-tmpfs\") pod \"packageserver-d55dfcdfc-26ztf\" (UID: \"a782c417-1524-4002-9d72-62747e465267\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.673631 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c-serving-cert\") pod \"service-ca-operator-777779d784-w8zpd\" (UID: \"9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8zpd" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.674542 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/385ca6a0-940d-409a-a0aa-b22ab8920177-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-bjv8b\" (UID: \"385ca6a0-940d-409a-a0aa-b22ab8920177\") " pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.677596 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/535566fd-2a2e-43a4-94cb-dea8d1e2123b-secret-volume\") pod \"collect-profiles-29320875-66s7c\" (UID: \"535566fd-2a2e-43a4-94cb-dea8d1e2123b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.677818 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/113264a3-0486-4bbf-bb1f-1c7494dbfeea-srv-cert\") pod \"catalog-operator-68c6474976-sr9l8\" (UID: \"113264a3-0486-4bbf-bb1f-1c7494dbfeea\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.678341 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.684958 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a782c417-1524-4002-9d72-62747e465267-webhook-cert\") pod \"packageserver-d55dfcdfc-26ztf\" (UID: \"a782c417-1524-4002-9d72-62747e465267\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.685361 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-svds2" event={"ID":"4d85180d-59c0-4f8f-8481-170f27db08b6","Type":"ContainerStarted","Data":"3c3173c7de6fcf96b49df87542d29d55812f7db34748a1d520782cc3a9d4391a"} Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.687177 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f-signing-key\") pod \"service-ca-9c57cc56f-p7f84\" (UID: \"ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f\") " pod="openshift-service-ca/service-ca-9c57cc56f-p7f84" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.689698 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a782c417-1524-4002-9d72-62747e465267-apiservice-cert\") pod \"packageserver-d55dfcdfc-26ztf\" (UID: \"a782c417-1524-4002-9d72-62747e465267\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.689768 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/51bf38ac-54b4-4f70-b3bf-14a2fef26dcb-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fn7v9\" (UID: \"51bf38ac-54b4-4f70-b3bf-14a2fef26dcb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.690411 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.690441 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/06cbc897-4734-4d93-8cfa-75dc0b31cc57-cert\") pod \"ingress-canary-rzzcv\" (UID: \"06cbc897-4734-4d93-8cfa-75dc0b31cc57\") " pod="openshift-ingress-canary/ingress-canary-rzzcv" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.691521 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/299ca171-a2bb-4da3-9630-31d12f416ae9-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-p6wqv\" (UID: \"299ca171-a2bb-4da3-9630-31d12f416ae9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p6wqv" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.693321 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f4ed657a-f56f-4237-a9bb-03aad72e437d-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-klvfk\" (UID: \"f4ed657a-f56f-4237-a9bb-03aad72e437d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-klvfk" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.695421 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/51bf38ac-54b4-4f70-b3bf-14a2fef26dcb-srv-cert\") pod \"olm-operator-6b444d44fb-fn7v9\" (UID: \"51bf38ac-54b4-4f70-b3bf-14a2fef26dcb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.698982 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/1f2da339-d7c7-4caa-8eb2-aa2bcd178b50-node-bootstrap-token\") pod \"machine-config-server-dq4bf\" (UID: \"1f2da339-d7c7-4caa-8eb2-aa2bcd178b50\") " pod="openshift-machine-config-operator/machine-config-server-dq4bf" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.706249 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.711006 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkph9"] Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.727121 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.728233 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.228195541 +0000 UTC m=+143.218093404 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.745666 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-sztwq"] Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.775800 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfg8p\" (UniqueName: \"kubernetes.io/projected/e6175301-9b87-47be-8c95-a4ce7fa0a413-kube-api-access-bfg8p\") pod \"oauth-openshift-558db77b4-56687\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.779927 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7b2ed2e0-659c-4371-bb93-039914cfa0b8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-95zm8\" (UID: \"7b2ed2e0-659c-4371-bb93-039914cfa0b8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-95zm8" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.805464 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs866\" (UniqueName: \"kubernetes.io/projected/e054b3bf-2796-4649-a9a0-8d19ea90412b-kube-api-access-fs866\") pod \"cluster-image-registry-operator-dc59b4c8b-ff94r\" (UID: \"e054b3bf-2796-4649-a9a0-8d19ea90412b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.829823 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nbbq\" (UniqueName: \"kubernetes.io/projected/5f326088-a1d0-43ee-9b2a-41e7ac797679-kube-api-access-5nbbq\") pod \"console-operator-58897d9998-hm822\" (UID: \"5f326088-a1d0-43ee-9b2a-41e7ac797679\") " pod="openshift-console-operator/console-operator-58897d9998-hm822" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.830865 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.832452 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.332431733 +0000 UTC m=+143.322329536 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.833677 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-bound-sa-token\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.842738 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nbbq\" (UniqueName: \"kubernetes.io/projected/5f326088-a1d0-43ee-9b2a-41e7ac797679-kube-api-access-5nbbq\") pod \"console-operator-58897d9998-hm822\" (UID: \"5f326088-a1d0-43ee-9b2a-41e7ac797679\") " pod="openshift-console-operator/console-operator-58897d9998-hm822" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.844656 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q7dx\" (UniqueName: \"kubernetes.io/projected/0de525d2-5fd2-4fd3-9524-3a5505955417-kube-api-access-6q7dx\") pod \"machine-api-operator-5694c8668f-whtc5\" (UID: \"0de525d2-5fd2-4fd3-9524-3a5505955417\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.861565 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.867535 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf8wz\" (UniqueName: \"kubernetes.io/projected/22eb8492-05be-4e05-a4b8-34f965c014ed-kube-api-access-nf8wz\") pod \"downloads-7954f5f757-xr85z\" (UID: \"22eb8492-05be-4e05-a4b8-34f965c014ed\") " pod="openshift-console/downloads-7954f5f757-xr85z" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.880669 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e054b3bf-2796-4649-a9a0-8d19ea90412b-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ff94r\" (UID: \"e054b3bf-2796-4649-a9a0-8d19ea90412b\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.900337 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fggq2\" (UniqueName: \"kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-kube-api-access-fggq2\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.916680 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.943019 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/89d4ed97-8e17-4eea-9774-19034031f4dc-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6vxnc\" (UID: \"89d4ed97-8e17-4eea-9774-19034031f4dc\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vxnc" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.943331 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-95zm8" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.943672 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.943841 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96v5t\" (UniqueName: \"kubernetes.io/projected/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-kube-api-access-96v5t\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.943944 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.443921536 +0000 UTC m=+143.433819339 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.943992 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:03 crc kubenswrapper[4778]: E0930 17:20:03.944531 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.444524699 +0000 UTC m=+143.434422502 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.951769 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96v5t\" (UniqueName: \"kubernetes.io/projected/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-kube-api-access-96v5t\") pod \"controller-manager-879f6c89f-wxgcd\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.964493 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vxnc" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.966352 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lh6m\" (UniqueName: \"kubernetes.io/projected/6b13f665-8a05-4466-a03b-511d2456f1ca-kube-api-access-7lh6m\") pod \"csi-hostpathplugin-td2mw\" (UID: \"6b13f665-8a05-4466-a03b-511d2456f1ca\") " pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.966880 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bjjs\" (UniqueName: \"kubernetes.io/projected/9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c-kube-api-access-5bjjs\") pod \"service-ca-operator-777779d784-w8zpd\" (UID: \"9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8zpd" Sep 30 17:20:03 crc kubenswrapper[4778]: I0930 17:20:03.978353 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-b72ms"] Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.005574 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw47d\" (UniqueName: \"kubernetes.io/projected/ce48deed-422c-411b-946d-30a87d293815-kube-api-access-gw47d\") pod \"control-plane-machine-set-operator-78cbb6b69f-b8229\" (UID: \"ce48deed-422c-411b-946d-30a87d293815\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b8229" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.006310 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tkmt\" (UniqueName: \"kubernetes.io/projected/535566fd-2a2e-43a4-94cb-dea8d1e2123b-kube-api-access-7tkmt\") pod \"collect-profiles-29320875-66s7c\" (UID: \"535566fd-2a2e-43a4-94cb-dea8d1e2123b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.020899 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b8229" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.027447 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b42nv\" (UniqueName: \"kubernetes.io/projected/299ca171-a2bb-4da3-9630-31d12f416ae9-kube-api-access-b42nv\") pod \"kube-storage-version-migrator-operator-b67b599dd-p6wqv\" (UID: \"299ca171-a2bb-4da3-9630-31d12f416ae9\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p6wqv" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.036345 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.043848 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5b85a8ab-2235-4f6f-826f-22f354550dfb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-rfdt6\" (UID: \"5b85a8ab-2235-4f6f-826f-22f354550dfb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.044789 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:04 crc kubenswrapper[4778]: E0930 17:20:04.045159 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.545129862 +0000 UTC m=+143.535027665 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.045518 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:04 crc kubenswrapper[4778]: E0930 17:20:04.045998 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.545980364 +0000 UTC m=+143.535878167 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.057958 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.068911 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbkm9\" (UniqueName: \"kubernetes.io/projected/cc7a228b-7187-412c-987d-696193bfce29-kube-api-access-qbkm9\") pod \"machine-config-operator-74547568cd-dvxvh\" (UID: \"cc7a228b-7187-412c-987d-696193bfce29\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.082066 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8zpd" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.083121 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.088829 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnn4q\" (UniqueName: \"kubernetes.io/projected/ee173b5a-1b04-4447-ac5e-42d11cc5e80c-kube-api-access-xnn4q\") pod \"machine-config-controller-84d6567774-nz4ww\" (UID: \"ee173b5a-1b04-4447-ac5e-42d11cc5e80c\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.104944 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vmgz\" (UniqueName: \"kubernetes.io/projected/385ca6a0-940d-409a-a0aa-b22ab8920177-kube-api-access-6vmgz\") pod \"marketplace-operator-79b997595-bjv8b\" (UID: \"385ca6a0-940d-409a-a0aa-b22ab8920177\") " pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.123069 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-td2mw" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.124952 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88gdp\" (UniqueName: \"kubernetes.io/projected/45e7824d-5ef7-4454-bdb7-8a02b52c6c45-kube-api-access-88gdp\") pod \"migrator-59844c95c7-nfjrg\" (UID: \"45e7824d-5ef7-4454-bdb7-8a02b52c6c45\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfjrg" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.143739 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.147549 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:04 crc kubenswrapper[4778]: E0930 17:20:04.147719 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.64768649 +0000 UTC m=+143.637584293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.147839 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.148032 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgczg\" (UniqueName: \"kubernetes.io/projected/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-kube-api-access-jgczg\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.148159 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f87n\" (UniqueName: \"kubernetes.io/projected/ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f-kube-api-access-4f87n\") pod \"service-ca-9c57cc56f-p7f84\" (UID: \"ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f\") " pod="openshift-service-ca/service-ca-9c57cc56f-p7f84" Sep 30 17:20:04 crc kubenswrapper[4778]: E0930 17:20:04.148540 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.648524653 +0000 UTC m=+143.638422456 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.155107 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgczg\" (UniqueName: \"kubernetes.io/projected/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-kube-api-access-jgczg\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.161143 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-xr85z" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.177497 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dxr7\" (UniqueName: \"kubernetes.io/projected/51bf38ac-54b4-4f70-b3bf-14a2fef26dcb-kube-api-access-6dxr7\") pod \"olm-operator-6b444d44fb-fn7v9\" (UID: \"51bf38ac-54b4-4f70-b3bf-14a2fef26dcb\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.179920 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6rz9\" (UniqueName: \"kubernetes.io/projected/2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3-kube-api-access-x6rz9\") pod \"dns-default-558p9\" (UID: \"2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3\") " pod="openshift-dns/dns-default-558p9" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.208255 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26jql\" (UniqueName: \"kubernetes.io/projected/06cbc897-4734-4d93-8cfa-75dc0b31cc57-kube-api-access-26jql\") pod \"ingress-canary-rzzcv\" (UID: \"06cbc897-4734-4d93-8cfa-75dc0b31cc57\") " pod="openshift-ingress-canary/ingress-canary-rzzcv" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.238753 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8d87\" (UniqueName: \"kubernetes.io/projected/f4ed657a-f56f-4237-a9bb-03aad72e437d-kube-api-access-w8d87\") pod \"package-server-manager-789f6589d5-klvfk\" (UID: \"f4ed657a-f56f-4237-a9bb-03aad72e437d\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-klvfk" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.251484 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwxch\" (UniqueName: \"kubernetes.io/projected/1f2da339-d7c7-4caa-8eb2-aa2bcd178b50-kube-api-access-lwxch\") pod \"machine-config-server-dq4bf\" (UID: \"1f2da339-d7c7-4caa-8eb2-aa2bcd178b50\") " pod="openshift-machine-config-operator/machine-config-server-dq4bf" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.252188 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:04 crc kubenswrapper[4778]: E0930 17:20:04.252545 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.752495955 +0000 UTC m=+143.742393758 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.252911 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:04 crc kubenswrapper[4778]: E0930 17:20:04.253392 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.753376519 +0000 UTC m=+143.743274322 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.277964 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p6wqv" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.293974 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skdds\" (UniqueName: \"kubernetes.io/projected/5b85a8ab-2235-4f6f-826f-22f354550dfb-kube-api-access-skdds\") pod \"ingress-operator-5b745b69d9-rfdt6\" (UID: \"5b85a8ab-2235-4f6f-826f-22f354550dfb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.304996 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttd8c\" (UniqueName: \"kubernetes.io/projected/a782c417-1524-4002-9d72-62747e465267-kube-api-access-ttd8c\") pod \"packageserver-d55dfcdfc-26ztf\" (UID: \"a782c417-1524-4002-9d72-62747e465267\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.306460 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg27x\" (UniqueName: \"kubernetes.io/projected/113264a3-0486-4bbf-bb1f-1c7494dbfeea-kube-api-access-rg27x\") pod \"catalog-operator-68c6474976-sr9l8\" (UID: \"113264a3-0486-4bbf-bb1f-1c7494dbfeea\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.308976 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.312836 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.327426 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.340267 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-klvfk" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.346474 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfjrg" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.361438 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.364067 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.372131 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:20:04 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Sep 30 17:20:04 crc kubenswrapper[4778]: [+]process-running ok Sep 30 17:20:04 crc kubenswrapper[4778]: healthz check failed Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.372187 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.372423 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-p7f84" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.375524 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:04 crc kubenswrapper[4778]: E0930 17:20:04.375684 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.875661176 +0000 UTC m=+143.865558979 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.375927 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:04 crc kubenswrapper[4778]: E0930 17:20:04.376341 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.876331262 +0000 UTC m=+143.866229065 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.389559 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.433164 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rzzcv" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.444288 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-558p9" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.453544 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dq4bf" Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.459414 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-56687"] Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.478230 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:04 crc kubenswrapper[4778]: E0930 17:20:04.478731 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:04.978698444 +0000 UTC m=+143.968596247 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.580124 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:04 crc kubenswrapper[4778]: E0930 17:20:04.581219 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:05.081193039 +0000 UTC m=+144.071091032 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.588958 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6"
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.595252 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8"
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.614566 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-v4spf" podStartSLOduration=122.614535953 podStartE2EDuration="2m2.614535953s" podCreationTimestamp="2025-09-30 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:04.611304648 +0000 UTC m=+143.601202451" watchObservedRunningTime="2025-09-30 17:20:04.614535953 +0000 UTC m=+143.604433756"
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.683169 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:04 crc kubenswrapper[4778]: E0930 17:20:04.734475 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:05.184952263 +0000 UTC m=+144.174850066 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.741935 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.742182 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-trusted-ca-bundle\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp"
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.742933 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-service-ca\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp"
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.742984 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-oauth-serving-cert\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp"
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.745474 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-oauth-serving-cert\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp"
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.746762 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f326088-a1d0-43ee-9b2a-41e7ac797679-config\") pod \"console-operator-58897d9998-hm822\" (UID: \"5f326088-a1d0-43ee-9b2a-41e7ac797679\") " pod="openshift-console-operator/console-operator-58897d9998-hm822"
Sep 30 17:20:04 crc kubenswrapper[4778]: E0930 17:20:04.750957 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:05.250923264 +0000 UTC m=+144.240821067 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.751662 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-95zm8"]
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.752467 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-trusted-ca-bundle\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp"
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.754944 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-service-ca\") pod \"console-f9d7485db-pl4fp\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " pod="openshift-console/console-f9d7485db-pl4fp"
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.765560 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f326088-a1d0-43ee-9b2a-41e7ac797679-config\") pod \"console-operator-58897d9998-hm822\" (UID: \"5f326088-a1d0-43ee-9b2a-41e7ac797679\") " pod="openshift-console-operator/console-operator-58897d9998-hm822"
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.775599 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-rqn89" podStartSLOduration=122.775555802 podStartE2EDuration="2m2.775555802s" podCreationTimestamp="2025-09-30 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:04.748641846 +0000 UTC m=+143.738539659" watchObservedRunningTime="2025-09-30 17:20:04.775555802 +0000 UTC m=+143.765453605"
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.776632 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-56687" event={"ID":"e6175301-9b87-47be-8c95-a4ce7fa0a413","Type":"ContainerStarted","Data":"5dfe25609eb4a37d12434609067eaeeb6504f9416e5b5ae024011a8da70405dd"}
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.790550 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" event={"ID":"38eeb4e2-2d0a-43c3-b305-aa464ec83096","Type":"ContainerStarted","Data":"a36d570bb431e670f1fa2174e650ca65c145ae12c7514f69d4c90bc5a9423cfb"}
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.801117 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-b72ms" event={"ID":"e848ffb5-3244-4b27-a090-c63685145174","Type":"ContainerStarted","Data":"8f43da0ce35ec501825f0bee6d42dba0168a476cc75484aa0708fca7840b7b97"}
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.801183 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-b72ms" event={"ID":"e848ffb5-3244-4b27-a090-c63685145174","Type":"ContainerStarted","Data":"d8722c9444832b602b06ce3e04736ba7e3a5181e22f35bf2d8e73fb51727b566"}
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.806766 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7njlp" event={"ID":"7b2932cf-321f-41d9-b5f7-e9969592da8d","Type":"ContainerStarted","Data":"b0e26cb2a5d4f33fde87f2783c7dc8c5ae59b4ca5631165bbe863883747f97b7"}
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.806836 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7njlp" event={"ID":"7b2932cf-321f-41d9-b5f7-e9969592da8d","Type":"ContainerStarted","Data":"794142837cd5820835b64c6c937c7b85742b80e5ce9d2122e95af711d9e2bd0f"}
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.822394 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs" event={"ID":"b30e2325-09e6-41a0-a0ad-67c3e01d2627","Type":"ContainerStarted","Data":"dc916f506e6c4c91a761a8e657e037677ffb3d757f9719e434534ffd41d17a49"}
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.822466 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs" event={"ID":"b30e2325-09e6-41a0-a0ad-67c3e01d2627","Type":"ContainerStarted","Data":"10331ae982b5f958fe9b58c42e082eb3015c7fdfcafa3ebc1991721f7583a6a2"}
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.830480 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq" event={"ID":"3189a025-d8d6-42bb-9e59-81c4dc54c5f2","Type":"ContainerStarted","Data":"9b7949697439601ad9d2287b537d213d9c789e7447cdf06633fa6db3b866aa8b"}
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.830549 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq" event={"ID":"3189a025-d8d6-42bb-9e59-81c4dc54c5f2","Type":"ContainerStarted","Data":"8908721b08de6ccd597e2f8ea83f551ab5b26ccc5aec4593eb13d2d40fd3dcda"}
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.847071 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k755p" event={"ID":"1075512e-3cc0-40fe-938c-07f5baaf964e","Type":"ContainerStarted","Data":"7cc04ac4b1d29c0148af8087546838068000b5c4a116391e23339506b94c3fb1"}
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.847425 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k755p"
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.849811 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkph9" event={"ID":"efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb","Type":"ContainerStarted","Data":"fe188d91835f368041a95608e97353565f75caf92d07da9721c0de1dfd9df87e"}
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.849895 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkph9" event={"ID":"efc8a3e6-dbd6-48a5-a62f-a3b0bef138bb","Type":"ContainerStarted","Data":"1c2c507b9472c3c7422d66db46e985b561034f9cde17e4ad121a6dda575c0a51"}
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.852076 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-zwnt5" podStartSLOduration=121.852049807 podStartE2EDuration="2m1.852049807s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:04.851272136 +0000 UTC m=+143.841169939" watchObservedRunningTime="2025-09-30 17:20:04.852049807 +0000 UTC m=+143.841947610"
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.853477 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:04 crc kubenswrapper[4778]: E0930 17:20:04.853803 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:05.353740022 +0000 UTC m=+144.343637965 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.854060 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:04 crc kubenswrapper[4778]: E0930 17:20:04.854899 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:05.354888025 +0000 UTC m=+144.344785828 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.865991 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-svds2" event={"ID":"4d85180d-59c0-4f8f-8481-170f27db08b6","Type":"ContainerStarted","Data":"6ae15e34c6883160952c88d95ea6f854196bd75a301bd87bb38e6bdb59002828"}
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.874472 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dq4bf" event={"ID":"1f2da339-d7c7-4caa-8eb2-aa2bcd178b50","Type":"ContainerStarted","Data":"3debd23814b4f1226c4cdcff228ee446efc3cc798c975a9542ca3964b65f8f1d"}
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.897600 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-svds2" podStartSLOduration=121.897577139 podStartE2EDuration="2m1.897577139s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:04.897515427 +0000 UTC m=+143.887413230" watchObservedRunningTime="2025-09-30 17:20:04.897577139 +0000 UTC m=+143.887474942"
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.917442 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pl4fp"
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.957896 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:04 crc kubenswrapper[4778]: E0930 17:20:04.958120 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:05.458083049 +0000 UTC m=+144.447980852 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.958607 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:04 crc kubenswrapper[4778]: E0930 17:20:04.961739 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:05.461723579 +0000 UTC m=+144.451621382 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:04 crc kubenswrapper[4778]: I0930 17:20:04.980502 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-hm822"
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.022869 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-vp8fx" podStartSLOduration=123.022821251 podStartE2EDuration="2m3.022821251s" podCreationTimestamp="2025-09-30 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:05.01734302 +0000 UTC m=+144.007240833" watchObservedRunningTime="2025-09-30 17:20:05.022821251 +0000 UTC m=+144.012719054"
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.063633 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:05 crc kubenswrapper[4778]: E0930 17:20:05.063978 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:05.563959145 +0000 UTC m=+144.553856948 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.164891 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:05 crc kubenswrapper[4778]: E0930 17:20:05.165313 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:05.665293085 +0000 UTC m=+144.655190888 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.268073 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:05 crc kubenswrapper[4778]: E0930 17:20:05.268236 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:05.768213848 +0000 UTC m=+144.758111661 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.268895 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:05 crc kubenswrapper[4778]: E0930 17:20:05.269162 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:05.769154684 +0000 UTC m=+144.759052487 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.341750 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-nprqp" podStartSLOduration=122.341725698 podStartE2EDuration="2m2.341725698s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:05.340692808 +0000 UTC m=+144.330590611" watchObservedRunningTime="2025-09-30 17:20:05.341725698 +0000 UTC m=+144.331623501"
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.342407 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" podStartSLOduration=122.342401504 podStartE2EDuration="2m2.342401504s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:05.304302397 +0000 UTC m=+144.294200200" watchObservedRunningTime="2025-09-30 17:20:05.342401504 +0000 UTC m=+144.332299307"
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.371161 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:05 crc kubenswrapper[4778]: E0930 17:20:05.371317 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:05.871284366 +0000 UTC m=+144.861182169 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.371736 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:05 crc kubenswrapper[4778]: E0930 17:20:05.372140 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:05.872131139 +0000 UTC m=+144.862028942 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.476774 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:05 crc kubenswrapper[4778]: E0930 17:20:05.480324 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:05.980279202 +0000 UTC m=+144.970177005 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.581906 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:05 crc kubenswrapper[4778]: E0930 17:20:05.582540 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:06.082524358 +0000 UTC m=+145.072422161 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.594014 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 17:20:05 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld
Sep 30 17:20:05 crc kubenswrapper[4778]: [+]process-running ok
Sep 30 17:20:05 crc kubenswrapper[4778]: healthz check failed
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.594099 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.685199 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:05 crc kubenswrapper[4778]: E0930 17:20:05.685886 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:06.185859916 +0000 UTC m=+145.175757719 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.787157 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:05 crc kubenswrapper[4778]: E0930 17:20:05.787730 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:06.287705477 +0000 UTC m=+145.277603280 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.888522 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:05 crc kubenswrapper[4778]: E0930 17:20:05.888799 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:06.388766228 +0000 UTC m=+145.378664041 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.889116 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:05 crc kubenswrapper[4778]: E0930 17:20:05.889414 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:06.389402612 +0000 UTC m=+145.379300405 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.897890 4778 generic.go:334] "Generic (PLEG): container finished" podID="e848ffb5-3244-4b27-a090-c63685145174" containerID="8f43da0ce35ec501825f0bee6d42dba0168a476cc75484aa0708fca7840b7b97" exitCode=0
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.897986 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-b72ms" event={"ID":"e848ffb5-3244-4b27-a090-c63685145174","Type":"ContainerDied","Data":"8f43da0ce35ec501825f0bee6d42dba0168a476cc75484aa0708fca7840b7b97"}
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.898054 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-b72ms" event={"ID":"e848ffb5-3244-4b27-a090-c63685145174","Type":"ContainerStarted","Data":"920cdb97c967ad96a128faf928afeb4774dc3f9c44cafba05a6916915eea7c89"}
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.900041 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-95zm8" event={"ID":"7b2ed2e0-659c-4371-bb93-039914cfa0b8","Type":"ContainerStarted","Data":"04c9dd4440fabdb38d1d1ec61529fb393287f2bf81bd3725790b01c16d3c4356"}
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.900109 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-95zm8" event={"ID":"7b2ed2e0-659c-4371-bb93-039914cfa0b8","Type":"ContainerStarted","Data":"b01615e20fbfe214f9f577cdecbddd04438d3ebc23dce84a2795940b566c7ac3"}
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.905321 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-7njlp" event={"ID":"7b2932cf-321f-41d9-b5f7-e9969592da8d","Type":"ContainerStarted","Data":"c22575fb9769e6f2bd47f7be5b75074d8f23a9550abc8eac16f28af046eceeac"}
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.909899 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dq4bf" event={"ID":"1f2da339-d7c7-4caa-8eb2-aa2bcd178b50","Type":"ContainerStarted","Data":"c60372d9c98edcd131f831627d1179897286b551a34a236ce9feff16024f7495"}
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.913931 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-56687" event={"ID":"e6175301-9b87-47be-8c95-a4ce7fa0a413","Type":"ContainerStarted","Data":"95f710ba8a212a81c8291b86ceecb5af6a9120a546220412b0d1fd62b04f578a"}
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.990052 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b8229"]
Sep 30 17:20:05 crc kubenswrapper[4778]: I0930 17:20:05.990605 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:05 crc kubenswrapper[4778]: E0930 17:20:05.993013 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:06.49299056 +0000 UTC m=+145.482888363 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.016112 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c"]
Sep 30 17:20:06 crc kubenswrapper[4778]: W0930 17:20:06.052164 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod535566fd_2a2e_43a4_94cb_dea8d1e2123b.slice/crio-be420535acc7a937d1dc95b340608a503777d0c1a923edd7ff566f98511e6db1 WatchSource:0}: Error finding container be420535acc7a937d1dc95b340608a503777d0c1a923edd7ff566f98511e6db1: Status 404 returned error can't find the container with id be420535acc7a937d1dc95b340608a503777d0c1a923edd7ff566f98511e6db1
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.055148 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wxgcd"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.099998 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:06 crc kubenswrapper[4778]: E0930 17:20:06.100573 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:06.600553861 +0000 UTC m=+145.590451664 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.130681 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.214788 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:06 crc kubenswrapper[4778]: E0930 17:20:06.216065 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:06.716035206 +0000 UTC m=+145.705933009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.236829 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vxnc"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.264008 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-dkph9" podStartSLOduration=123.263984872 podStartE2EDuration="2m3.263984872s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:06.262385061 +0000 UTC m=+145.252282854" watchObservedRunningTime="2025-09-30 17:20:06.263984872 +0000 UTC m=+145.253882675"
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.282909 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-w8zpd"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.287750 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-td2mw"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.317414 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:06 crc kubenswrapper[4778]: E0930 17:20:06.317748 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:06.817734971 +0000 UTC m=+145.807632774 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.321084 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xr85z"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.339097 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k755p" podStartSLOduration=124.339070333 podStartE2EDuration="2m4.339070333s" podCreationTimestamp="2025-09-30 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:06.30547904 +0000 UTC m=+145.295376843" watchObservedRunningTime="2025-09-30 17:20:06.339070333 +0000 UTC m=+145.328968136"
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.343813 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-dq4bf" podStartSLOduration=5.343798825 podStartE2EDuration="5.343798825s" podCreationTimestamp="2025-09-30 17:20:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:06.333108644 +0000 UTC m=+145.323006447" watchObservedRunningTime="2025-09-30 17:20:06.343798825 +0000 UTC m=+145.333696628"
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.344379 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-whtc5"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.366556 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kghfs" podStartSLOduration=123.366538341 podStartE2EDuration="2m3.366538341s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:06.364938489 +0000 UTC m=+145.354836292" watchObservedRunningTime="2025-09-30 17:20:06.366538341 +0000 UTC m=+145.356436144"
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.368654 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 17:20:06 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld
Sep 30 17:20:06 crc kubenswrapper[4778]: [+]process-running ok
Sep 30 17:20:06 crc kubenswrapper[4778]: healthz check failed
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.369338 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 17:20:06 crc kubenswrapper[4778]: W0930 17:20:06.394297 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0de525d2_5fd2_4fd3_9524_3a5505955417.slice/crio-f2fb27599958bf1759aac0149b01bbd1b5e12dcc5046e329c0b9ec896d70660a WatchSource:0}: Error finding container f2fb27599958bf1759aac0149b01bbd1b5e12dcc5046e329c0b9ec896d70660a: Status 404 returned error can't find the container with id f2fb27599958bf1759aac0149b01bbd1b5e12dcc5046e329c0b9ec896d70660a
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.420485 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:06 crc kubenswrapper[4778]: E0930 17:20:06.421236 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:06.921196015 +0000 UTC m=+145.911093828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.423837 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-klvfk"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.522730 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:06 crc kubenswrapper[4778]: E0930 17:20:06.523101 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:07.023086568 +0000 UTC m=+146.012984371 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.577174 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-sztwq" podStartSLOduration=123.577139128 podStartE2EDuration="2m3.577139128s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:06.515733124 +0000 UTC m=+145.505630927" watchObservedRunningTime="2025-09-30 17:20:06.577139128 +0000 UTC m=+145.567036931"
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.584427 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-95zm8" podStartSLOduration=123.584411999 podStartE2EDuration="2m3.584411999s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:06.571061164 +0000 UTC m=+145.560958957" watchObservedRunningTime="2025-09-30 17:20:06.584411999 +0000 UTC m=+145.574309802"
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.587351 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p6wqv"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.602474 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.624740 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:06 crc kubenswrapper[4778]: E0930 17:20:06.625187 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:07.125170748 +0000 UTC m=+146.115068541 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.668380 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.708164 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.741236 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-558p9"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.752024 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:06 crc kubenswrapper[4778]: E0930 17:20:06.755074 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:07.255058787 +0000 UTC m=+146.244956580 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.768819 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nfjrg"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.770449 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" podStartSLOduration=123.770421899 podStartE2EDuration="2m3.770421899s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:06.699165136 +0000 UTC m=+145.689062949" watchObservedRunningTime="2025-09-30 17:20:06.770421899 +0000 UTC m=+145.760319702"
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.801679 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-7njlp" podStartSLOduration=123.801648612 podStartE2EDuration="2m3.801648612s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:06.763323676 +0000 UTC m=+145.753221479" watchObservedRunningTime="2025-09-30 17:20:06.801648612 +0000 UTC m=+145.791546415"
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.812731 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.828791 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.837683 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-p7f84"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.851368 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.856857 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:06 crc kubenswrapper[4778]: E0930 17:20:06.857543 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:07.357518152 +0000 UTC m=+146.347415955 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.865642 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rzzcv"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.873951 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-hm822"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.874123 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-56687" podStartSLOduration=123.874098971 podStartE2EDuration="2m3.874098971s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:06.838211549 +0000 UTC m=+145.828109352" watchObservedRunningTime="2025-09-30 17:20:06.874098971 +0000 UTC m=+145.863996774"
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.878213 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bjv8b"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.883806 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-pl4fp"]
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.929119 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-klvfk" event={"ID":"f4ed657a-f56f-4237-a9bb-03aad72e437d","Type":"ContainerStarted","Data":"a6f44b6611c9230b210f6ce10787618dcffc2dce1bd7527cef6edaa2efd2e4a6"}
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.929187 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-klvfk" event={"ID":"f4ed657a-f56f-4237-a9bb-03aad72e437d","Type":"ContainerStarted","Data":"3f9a89bede62f22547492cad6f4a8e4bc6587ca69a4604876d0addd3f7f30ca4"}
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.931531 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8" event={"ID":"113264a3-0486-4bbf-bb1f-1c7494dbfeea","Type":"ContainerStarted","Data":"3eabad68e6340f25a94a1b079dae111d6ff0d83a54f50190e6fc20ff90144799"}
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.942774 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-b72ms" event={"ID":"e848ffb5-3244-4b27-a090-c63685145174","Type":"ContainerStarted","Data":"c07a7dc17c9b7f247e3ca628c97cf5352b9605d0f7fcc1c764e400f9819ed9cf"}
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.962506 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:06 crc kubenswrapper[4778]: E0930 17:20:06.962842 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:07.462830556 +0000 UTC m=+146.452728359 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.964189 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" event={"ID":"c1da094b-f44d-4d9d-9fe8-3d19ea244d09","Type":"ContainerStarted","Data":"79703371c7ff4ccfeee833878d766f87ca0dc88ff8b21e6e4a1438e5d17d8d62"}
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.964267 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" event={"ID":"c1da094b-f44d-4d9d-9fe8-3d19ea244d09","Type":"ContainerStarted","Data":"58df60e693a043f17da768ad0269c81e943882cb2f7f2185469d95aa9b26d3ba"}
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.964842 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd"
Sep 30 17:20:06 crc kubenswrapper[4778]: W0930 17:20:06.965825 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06cbc897_4734_4d93_8cfa_75dc0b31cc57.slice/crio-67be5506615f0b462d1aec948e38c3706aa1dad9236a805fc88983e31a1cf0ad WatchSource:0}: Error finding container 67be5506615f0b462d1aec948e38c3706aa1dad9236a805fc88983e31a1cf0ad: Status 404 returned error can't find the container with id 67be5506615f0b462d1aec948e38c3706aa1dad9236a805fc88983e31a1cf0ad
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.969714 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" event={"ID":"cc7a228b-7187-412c-987d-696193bfce29","Type":"ContainerStarted","Data":"bc8f3ca96c8bd030cf3fc9d0657b4fe34d86f8d053667f7fe6c6504aa8150eb1"}
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.971132 4778 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-wxgcd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.971188 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" podUID="c1da094b-f44d-4d9d-9fe8-3d19ea244d09" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.978373 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-b72ms" podStartSLOduration=123.978347444 podStartE2EDuration="2m3.978347444s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:06.972834261 +0000 UTC m=+145.962732064" watchObservedRunningTime="2025-09-30 17:20:06.978347444 +0000 UTC m=+145.968245247"
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.979313 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pl4fp" event={"ID":"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0","Type":"ContainerStarted","Data":"cab6357fe53574e5e2b96c4711518cd3e474b937d37cb4acd5959356abcc1c2f"}
Sep 30 17:20:06 crc kubenswrapper[4778]: W0930 17:20:06.982482 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod385ca6a0_940d_409a_a0aa_b22ab8920177.slice/crio-db5dff8fc4b4e3c8e77b18a638ba0d5c88666f3faec6c3fac9fda17c68c7c412 WatchSource:0}: Error finding container db5dff8fc4b4e3c8e77b18a638ba0d5c88666f3faec6c3fac9fda17c68c7c412: Status 404 returned error can't find the container with id db5dff8fc4b4e3c8e77b18a638ba0d5c88666f3faec6c3fac9fda17c68c7c412
Sep 30 17:20:06 crc kubenswrapper[4778]: I0930 17:20:06.994774 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" podStartSLOduration=124.994747236 podStartE2EDuration="2m4.994747236s" podCreationTimestamp="2025-09-30 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:06.994125931 +0000 UTC m=+145.984023744" watchObservedRunningTime="2025-09-30 17:20:06.994747236 +0000 UTC m=+145.984645039"
Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.001342 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" event={"ID":"a782c417-1524-4002-9d72-62747e465267","Type":"ContainerStarted","Data":"8d8f2d7738de53dcc692e10a369932ff4d25f2f0c0bdc66fd9ebd9625e4ed26e"}
Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.003394 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p6wqv" event={"ID":"299ca171-a2bb-4da3-9630-31d12f416ae9","Type":"ContainerStarted","Data":"e2bb6c547109e88c5c5510093e2ef3dfa69f321e0f947a5277c13b340c1598ee"}
Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.009172 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vxnc" event={"ID":"89d4ed97-8e17-4eea-9774-19034031f4dc","Type":"ContainerStarted","Data":"ed291fb05c8b0dcd5feea17be0169efbe65de231cffd872ec587c7ad8d17ebe2"}
Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.014860 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-p7f84" event={"ID":"ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f","Type":"ContainerStarted","Data":"3960baa070714fe8bd0f02b56efeede25a251f63c4ebfe2c854b8f20e5e60ed8"}
Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.016443 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww"
event={"ID":"ee173b5a-1b04-4447-ac5e-42d11cc5e80c","Type":"ContainerStarted","Data":"3f87c02d8c26bbeffde886dd8a3ffb19dfb400120a99c759952756e7d7479527"} Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.023415 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r" event={"ID":"e054b3bf-2796-4649-a9a0-8d19ea90412b","Type":"ContainerStarted","Data":"5143b2ae4bbb295b852ee31d32bcf39f645d07e41c088709f3d7af6a3366f2f2"} Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.023466 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r" event={"ID":"e054b3bf-2796-4649-a9a0-8d19ea90412b","Type":"ContainerStarted","Data":"a4690f42c9131f6278591463802ff0746ef3420de94f840d044aa789f73f283e"} Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.033332 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-558p9" event={"ID":"2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3","Type":"ContainerStarted","Data":"96c2050569cd5c842dd2e3c705f68c08580dd66ae7447e10f4a3cf5da2a555a5"} Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.046344 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ff94r" podStartSLOduration=124.045208388 podStartE2EDuration="2m4.045208388s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:07.039740257 +0000 UTC m=+146.029638080" watchObservedRunningTime="2025-09-30 17:20:07.045208388 +0000 UTC m=+146.035106191" Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.047230 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" event={"ID":"0de525d2-5fd2-4fd3-9524-3a5505955417","Type":"ContainerStarted","Data":"a3f4f66d8dfefcf782b9a0a4cdd320c1353d4efdb14447db6100add5631e1302"} Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.047289 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" event={"ID":"0de525d2-5fd2-4fd3-9524-3a5505955417","Type":"ContainerStarted","Data":"f2fb27599958bf1759aac0149b01bbd1b5e12dcc5046e329c0b9ec896d70660a"} Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.054233 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-td2mw" event={"ID":"6b13f665-8a05-4466-a03b-511d2456f1ca","Type":"ContainerStarted","Data":"0f62eecbd7c9304290fcd69fab7cb03bd07917ebd9ae0658f9c1f980288172f6"} Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.058122 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfjrg" event={"ID":"45e7824d-5ef7-4454-bdb7-8a02b52c6c45","Type":"ContainerStarted","Data":"da17ccf7e0b6af83351515fb8a66d94ec0752ee5f41bc5efd25917851e9e3a00"} Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.061245 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b8229" event={"ID":"ce48deed-422c-411b-946d-30a87d293815","Type":"ContainerStarted","Data":"deb5b41c8ac0e54e2689acf52f588ff0889eddfc32f6796a47af910f8c82f48a"} Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.061296 4778 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b8229" event={"ID":"ce48deed-422c-411b-946d-30a87d293815","Type":"ContainerStarted","Data":"4ab347bd9570243948510013fdd1318c32723b7b85e7922465622cb2b39b3217"} Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.063163 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:07 crc kubenswrapper[4778]: E0930 17:20:07.063286 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:07.563263163 +0000 UTC m=+146.553160956 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.063443 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:07 crc kubenswrapper[4778]: E0930 17:20:07.064589 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:07.564575954 +0000 UTC m=+146.554473757 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.064776 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8zpd" event={"ID":"9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c","Type":"ContainerStarted","Data":"2071394ecd00bc17c0395fc50f0489f39d35be13618a7fc1e76539ec837cfdbe"} Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.064822 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8zpd" event={"ID":"9653d8ee-00bd-4f1a-ae33-b7b9fc0edf0c","Type":"ContainerStarted","Data":"f3d6c03415f88f04c7ed09330e1c6080818bc116ff015a627ae283a0cdb3423b"} Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.067080 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c" event={"ID":"535566fd-2a2e-43a4-94cb-dea8d1e2123b","Type":"ContainerStarted","Data":"b219f43da38348e2df202028bf2ab09060d1fa63e6dfc7a8fe4c44d947aa3369"} Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.067115 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c" event={"ID":"535566fd-2a2e-43a4-94cb-dea8d1e2123b","Type":"ContainerStarted","Data":"be420535acc7a937d1dc95b340608a503777d0c1a923edd7ff566f98511e6db1"} Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.072658 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6" event={"ID":"5b85a8ab-2235-4f6f-826f-22f354550dfb","Type":"ContainerStarted","Data":"966b6c82eb0f1cc1c232947210ca9165cefb296932537de1b6099e4c4dbba57e"} Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.085402 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xr85z" event={"ID":"22eb8492-05be-4e05-a4b8-34f965c014ed","Type":"ContainerStarted","Data":"8760a4f7aec6b50f659da33cf9567f606a82e59879ddbd0f5485003766a5e95c"} Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.085456 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-xr85z" Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.085469 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xr85z" event={"ID":"22eb8492-05be-4e05-a4b8-34f965c014ed","Type":"ContainerStarted","Data":"301f984cb82311ef534ef0f962930ecdbdda0c49ed0fe6c14625cce1802d1fb3"} Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.086540 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.096283 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-xr85z container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Sep 30 17:20:07 crc 
kubenswrapper[4778]: I0930 17:20:07.096354 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xr85z" podUID="22eb8492-05be-4e05-a4b8-34f965c014ed" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.111095 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-b8229" podStartSLOduration=124.111075623 podStartE2EDuration="2m4.111075623s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:07.080498186 +0000 UTC m=+146.070396009" watchObservedRunningTime="2025-09-30 17:20:07.111075623 +0000 UTC m=+146.100973426" Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.133162 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-w8zpd" podStartSLOduration=124.133140513 podStartE2EDuration="2m4.133140513s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:07.111804962 +0000 UTC m=+146.101702785" watchObservedRunningTime="2025-09-30 17:20:07.133140513 +0000 UTC m=+146.123038316" Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.165680 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c" podStartSLOduration=125.165660395 podStartE2EDuration="2m5.165660395s" podCreationTimestamp="2025-09-30 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:07.133391423 +0000 UTC m=+146.123289236" watchObservedRunningTime="2025-09-30 17:20:07.165660395 +0000 UTC m=+146.155558198" Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.169347 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-xr85z" podStartSLOduration=125.169338047 podStartE2EDuration="2m5.169338047s" podCreationTimestamp="2025-09-30 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:07.16918852 +0000 UTC m=+146.159086323" watchObservedRunningTime="2025-09-30 17:20:07.169338047 +0000 UTC m=+146.159235850" Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.169582 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:07 crc kubenswrapper[4778]: E0930 17:20:07.169773 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:07.669741402 +0000 UTC m=+146.659639205 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.169988 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:07 crc kubenswrapper[4778]: E0930 17:20:07.170909 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:07.670899357 +0000 UTC m=+146.660797150 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.199259 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.271352 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:07 crc kubenswrapper[4778]: E0930 17:20:07.273765 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:07.773725185 +0000 UTC m=+146.763623128 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.320029 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.320105 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.327526 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.361857 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:20:07 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Sep 30 17:20:07 crc kubenswrapper[4778]: [+]process-running ok Sep 30 17:20:07 crc kubenswrapper[4778]: healthz check failed Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.361913 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.376140 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:07 crc kubenswrapper[4778]: E0930 17:20:07.376648 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:07.876607506 +0000 UTC m=+146.866505499 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.477025 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:07 crc kubenswrapper[4778]: E0930 17:20:07.477311 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:07.977268531 +0000 UTC m=+146.967166334 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.477945 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:07 crc kubenswrapper[4778]: E0930 17:20:07.479189 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:07.979158844 +0000 UTC m=+146.969056857 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.580359 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:07 crc kubenswrapper[4778]: E0930 17:20:07.580751 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:08.080707753 +0000 UTC m=+147.070605556 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.682301 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:07 crc kubenswrapper[4778]: E0930 17:20:07.682888 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:08.182862146 +0000 UTC m=+147.172759949 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.783768 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:07 crc kubenswrapper[4778]: E0930 17:20:07.783921 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:08.283900715 +0000 UTC m=+147.273798508 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.784388 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:07 crc kubenswrapper[4778]: E0930 17:20:07.784772 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:08.284764869 +0000 UTC m=+147.274662672 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.885992 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:07 crc kubenswrapper[4778]: E0930 17:20:07.886173 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:08.386151612 +0000 UTC m=+147.376049415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.886317 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:07 crc kubenswrapper[4778]: E0930 17:20:07.886700 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:08.386688933 +0000 UTC m=+147.376586746 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.987150 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:07 crc kubenswrapper[4778]: E0930 17:20:07.987409 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:08.487378849 +0000 UTC m=+147.477276652 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:07 crc kubenswrapper[4778]: I0930 17:20:07.987524 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:07 crc kubenswrapper[4778]: E0930 17:20:07.987903 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:08.487889479 +0000 UTC m=+147.477787282 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.088956 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:08 crc kubenswrapper[4778]: E0930 17:20:08.089095 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:08.589063394 +0000 UTC m=+147.578961197 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.089137 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:08 crc kubenswrapper[4778]: E0930 17:20:08.089669 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:08.589647356 +0000 UTC m=+147.579545149 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.105906 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6" event={"ID":"5b85a8ab-2235-4f6f-826f-22f354550dfb","Type":"ContainerStarted","Data":"8c9b67ce257c70d016dbdc3c9ef0d64ca42b017fd31b374ea17268375125997a"} Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.107698 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfjrg" event={"ID":"45e7824d-5ef7-4454-bdb7-8a02b52c6c45","Type":"ContainerStarted","Data":"80721f02ad9092bd96c109b3cf62ff68013af0b97147db35db6405dfc031688c"} Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.109064 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9" event={"ID":"51bf38ac-54b4-4f70-b3bf-14a2fef26dcb","Type":"ContainerStarted","Data":"c6b77c940370c53120fcd36ea94ca30e50b5acbe06c95544a4d24f17154bfdd3"} Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.110969 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vxnc" event={"ID":"89d4ed97-8e17-4eea-9774-19034031f4dc","Type":"ContainerStarted","Data":"88f17613b6f95084d5475bd253c6f60d0ca78623b63397cd4c9343dcc4d932bc"} Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.113918 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hm822" event={"ID":"5f326088-a1d0-43ee-9b2a-41e7ac797679","Type":"ContainerStarted","Data":"f4812dcda297d97e4507799bf0b735bed62fe1e71665f538c390a6447b4a9962"} Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.114847 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rzzcv" event={"ID":"06cbc897-4734-4d93-8cfa-75dc0b31cc57","Type":"ContainerStarted","Data":"67be5506615f0b462d1aec948e38c3706aa1dad9236a805fc88983e31a1cf0ad"} Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.116017 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" event={"ID":"385ca6a0-940d-409a-a0aa-b22ab8920177","Type":"ContainerStarted","Data":"db5dff8fc4b4e3c8e77b18a638ba0d5c88666f3faec6c3fac9fda17c68c7c412"} Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.118558 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" event={"ID":"a782c417-1524-4002-9d72-62747e465267","Type":"ContainerStarted","Data":"a38c415bf15291ab88a1da4c9cd59d8fb253c4eb30fad0fd8f038f543ffda9d8"} Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.121542 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-klvfk" event={"ID":"f4ed657a-f56f-4237-a9bb-03aad72e437d","Type":"ContainerStarted","Data":"8c3a5e72fd389171c49b6976a4ef7faa69a22ab4e288fadefc5e957bed683046"} Sep 30 17:20:08 
crc kubenswrapper[4778]: I0930 17:20:08.123050 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p6wqv" event={"ID":"299ca171-a2bb-4da3-9630-31d12f416ae9","Type":"ContainerStarted","Data":"d1deb7e44bae0d827e527c977fb92bd14b9ba6502b7789f1b657970107345580"} Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.124000 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-xr85z container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.124047 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xr85z" podUID="22eb8492-05be-4e05-a4b8-34f965c014ed" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.124830 4778 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-wxgcd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.124878 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" podUID="c1da094b-f44d-4d9d-9fe8-3d19ea244d09" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.129927 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-pkzrb" Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.132506 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6vxnc" podStartSLOduration=125.132479205 podStartE2EDuration="2m5.132479205s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:08.129344164 +0000 UTC m=+147.119241967" watchObservedRunningTime="2025-09-30 17:20:08.132479205 +0000 UTC m=+147.122377018" Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.155210 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-p6wqv" podStartSLOduration=125.155190529 podStartE2EDuration="2m5.155190529s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:08.153765965 +0000 UTC m=+147.143663778" watchObservedRunningTime="2025-09-30 17:20:08.155190529 +0000 UTC m=+147.145088332" Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.190606 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:08 crc kubenswrapper[4778]: E0930 17:20:08.190790 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:08.690759279 +0000 UTC m=+147.680657082 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.191442 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:08 crc kubenswrapper[4778]: E0930 17:20:08.192157 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:08.692128381 +0000 UTC m=+147.682026364 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.293212 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:08 crc kubenswrapper[4778]: E0930 17:20:08.293670 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:08.793602227 +0000 UTC m=+147.783500040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.294246 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:08 crc kubenswrapper[4778]: E0930 17:20:08.296324 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:08.796312242 +0000 UTC m=+147.786210045 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.317093 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k755p" Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.366546 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:20:08 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Sep 30 17:20:08 crc kubenswrapper[4778]: [+]process-running ok Sep 30 17:20:08 crc kubenswrapper[4778]: healthz check failed Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.366604 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.395638 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:08 crc kubenswrapper[4778]: E0930 17:20:08.396020 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:08.89600123 +0000 UTC m=+147.885899033 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.497746 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:08 crc kubenswrapper[4778]: E0930 17:20:08.500093 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.000069237 +0000 UTC m=+147.989967050 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.599670 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:08 crc kubenswrapper[4778]: E0930 17:20:08.601287 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.101242621 +0000 UTC m=+148.091140424 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.606700 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-b72ms"
Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.610136 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-b72ms"
Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.702904 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:08 crc kubenswrapper[4778]: E0930 17:20:08.703341 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.203323611 +0000 UTC m=+148.193221414 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.803702 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:08 crc kubenswrapper[4778]: E0930 17:20:08.803829 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.303797109 +0000 UTC m=+148.293694932 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.803956 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:08 crc kubenswrapper[4778]: E0930 17:20:08.804260 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.304247006 +0000 UTC m=+148.294144879 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.845108 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx"
Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.905555 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:08 crc kubenswrapper[4778]: E0930 17:20:08.905736 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.405711962 +0000 UTC m=+148.395609765 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:08 crc kubenswrapper[4778]: I0930 17:20:08.905965 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:08 crc kubenswrapper[4778]: E0930 17:20:08.906398 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.406381808 +0000 UTC m=+148.396279661 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.007111 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:09 crc kubenswrapper[4778]: E0930 17:20:09.007311 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.507281902 +0000 UTC m=+148.497179705 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.007412 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:09 crc kubenswrapper[4778]: E0930 17:20:09.007762 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.507753251 +0000 UTC m=+148.497651054 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.109217 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:09 crc kubenswrapper[4778]: E0930 17:20:09.109450 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.609414795 +0000 UTC m=+148.599312588 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.109740 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:09 crc kubenswrapper[4778]: E0930 17:20:09.110122 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.610107731 +0000 UTC m=+148.600005534 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.130294 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" event={"ID":"cc7a228b-7187-412c-987d-696193bfce29","Type":"ContainerStarted","Data":"027a74018911e80941611e4475c5ca32c3091a950cba7a8ab0d7f19e6a4d9f5f"}
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.130345 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" event={"ID":"cc7a228b-7187-412c-987d-696193bfce29","Type":"ContainerStarted","Data":"0b62a0939174bba86d2c9391d3bc4d177fde97e28155af27f8afa8ad4150374c"}
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.133229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww" event={"ID":"ee173b5a-1b04-4447-ac5e-42d11cc5e80c","Type":"ContainerStarted","Data":"1fe9121a6132733f90b6679676a4838af7fb182d92dc1d8827885ab200ae9fa0"}
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.133290 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww" event={"ID":"ee173b5a-1b04-4447-ac5e-42d11cc5e80c","Type":"ContainerStarted","Data":"2cb621b8acd649751400b7c667527b9bc1ec7d31bc34f3822c4c23074caa542c"}
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.134697 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-hm822" event={"ID":"5f326088-a1d0-43ee-9b2a-41e7ac797679","Type":"ContainerStarted","Data":"89a57b43473158945ac519566c9fb1a6c5e635ec8ed295c8a8893c0917f5093e"}
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.135451 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-hm822"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.137556 4778 patch_prober.go:28] interesting pod/console-operator-58897d9998-hm822 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body=
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.137591 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hm822" podUID="5f326088-a1d0-43ee-9b2a-41e7ac797679" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.138989 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-558p9" event={"ID":"2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3","Type":"ContainerStarted","Data":"1de77f94ad0409271613b5e5d0a804d4ed10281113aa4a68aeb2b9afb31418ba"}
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.139018 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-558p9" event={"ID":"2fbba6ff-d43a-4c90-aa8e-78d54f7f36f3","Type":"ContainerStarted","Data":"3de27a85d81db62343308f729f194c5cc9a5243241841e7d69fe87365fedaa30"}
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.139551 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-558p9"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.140942 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfjrg" event={"ID":"45e7824d-5ef7-4454-bdb7-8a02b52c6c45","Type":"ContainerStarted","Data":"ce0d7c4a253c7c88fb8ba2572a6c54a69fa0ae9be2031537c5dca057a2602233"}
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.142944 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rzzcv" event={"ID":"06cbc897-4734-4d93-8cfa-75dc0b31cc57","Type":"ContainerStarted","Data":"effc6aa66504cf9e5da90c877c65dbc1c9439fb43731ef03deeaf4f9b5505568"}
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.144196 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-p7f84" event={"ID":"ca1cd7fe-7a33-4516-9bf8-c45f031f5c5f","Type":"ContainerStarted","Data":"cf268eccb04392f2395911f6187e80dc09beb2e4c15526f4fd1c3270849532f0"}
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.145341 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" event={"ID":"385ca6a0-940d-409a-a0aa-b22ab8920177","Type":"ContainerStarted","Data":"0b4181a02a2d03d2c366adbb69cccaec7ef34b958bdf7fdcc4aac093b39b59f3"}
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.146071 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.147660 4778 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bjv8b container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.147728 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" podUID="385ca6a0-940d-409a-a0aa-b22ab8920177" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.148143 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6" event={"ID":"5b85a8ab-2235-4f6f-826f-22f354550dfb","Type":"ContainerStarted","Data":"07e8a8d0a7277da51d121ea5f5b4ae000d7f8261f256f192e127df5909a0dc36"}
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.149793 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pl4fp" event={"ID":"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0","Type":"ContainerStarted","Data":"0d5308370a9f62c1a4361ed86a2e0234faba68561f4ed818c88df78bac7a5920"}
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.150889 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-td2mw" event={"ID":"6b13f665-8a05-4466-a03b-511d2456f1ca","Type":"ContainerStarted","Data":"eab4333607870ac91a09d86759d7b7d26787ac95e9a436e18072dd10a5b49738"}
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.151867 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8" event={"ID":"113264a3-0486-4bbf-bb1f-1c7494dbfeea","Type":"ContainerStarted","Data":"fe9a4eb50eb557d60e2926dc66e61250f1bcdb007cb4d00d95c37f9a3119fba5"}
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.152443 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.153744 4778 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-sr9l8 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body=
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.153783 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8" podUID="113264a3-0486-4bbf-bb1f-1c7494dbfeea" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.154275 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9" event={"ID":"51bf38ac-54b4-4f70-b3bf-14a2fef26dcb","Type":"ContainerStarted","Data":"96f2ed453390b137f77f876806ade3caf24007267ac9c38ed47be1d8f2b97e3f"}
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.154492 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.156852 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" event={"ID":"0de525d2-5fd2-4fd3-9524-3a5505955417","Type":"ContainerStarted","Data":"c07fb2bbac0612d8df1bd764c71a7bcf68a57f87c9b6428982fc8a4a804537ed"}
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.157263 4778 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-fn7v9 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body=
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.157300 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9" podUID="51bf38ac-54b4-4f70-b3bf-14a2fef26dcb" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.168448 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-dvxvh" podStartSLOduration=126.168432667 podStartE2EDuration="2m6.168432667s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:09.166926849 +0000 UTC m=+148.156824652" watchObservedRunningTime="2025-09-30 17:20:09.168432667 +0000 UTC m=+148.158330470"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.200894 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-klvfk" podStartSLOduration=126.200869505 podStartE2EDuration="2m6.200869505s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:09.200322114 +0000 UTC m=+148.190219937" watchObservedRunningTime="2025-09-30 17:20:09.200869505 +0000 UTC m=+148.190767308"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.210679 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:09 crc kubenswrapper[4778]: E0930 17:20:09.212004 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.711987044 +0000 UTC m=+148.701884847 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.233202 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-558p9" podStartSLOduration=9.233179929 podStartE2EDuration="9.233179929s" podCreationTimestamp="2025-09-30 17:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:09.231056107 +0000 UTC m=+148.220953910" watchObservedRunningTime="2025-09-30 17:20:09.233179929 +0000 UTC m=+148.223077732"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.258490 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-rfdt6" podStartSLOduration=126.258470773 podStartE2EDuration="2m6.258470773s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:09.254593374 +0000 UTC m=+148.244491187" watchObservedRunningTime="2025-09-30 17:20:09.258470773 +0000 UTC m=+148.248368576"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.312093 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:09 crc kubenswrapper[4778]: E0930 17:20:09.316348 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.81632918 +0000 UTC m=+148.806227073 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.326651 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" podStartSLOduration=126.326626757 podStartE2EDuration="2m6.326626757s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:09.301330643 +0000 UTC m=+148.291228446" watchObservedRunningTime="2025-09-30 17:20:09.326626757 +0000 UTC m=+148.316524560"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.331407 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-p7f84" podStartSLOduration=126.3313897 podStartE2EDuration="2m6.3313897s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:09.325020425 +0000 UTC m=+148.314918228" watchObservedRunningTime="2025-09-30 17:20:09.3313897 +0000 UTC m=+148.321287503"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.363427 4778 patch_prober.go:28] interesting pod/apiserver-76f77b778f-b72ms container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Sep 30 17:20:09 crc kubenswrapper[4778]: [+]log ok
Sep 30 17:20:09 crc kubenswrapper[4778]: [+]etcd ok
Sep 30 17:20:09 crc kubenswrapper[4778]: [+]poststarthook/start-apiserver-admission-initializer ok
Sep 30 17:20:09 crc kubenswrapper[4778]: [+]poststarthook/generic-apiserver-start-informers ok
Sep 30 17:20:09 crc kubenswrapper[4778]: [+]poststarthook/max-in-flight-filter ok
Sep 30 17:20:09 crc kubenswrapper[4778]: [+]poststarthook/storage-object-count-tracker-hook ok
Sep 30 17:20:09 crc kubenswrapper[4778]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Sep 30 17:20:09 crc kubenswrapper[4778]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Sep 30 17:20:09 crc kubenswrapper[4778]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Sep 30 17:20:09 crc kubenswrapper[4778]: [+]poststarthook/project.openshift.io-projectcache ok
Sep 30 17:20:09 crc kubenswrapper[4778]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Sep 30 17:20:09 crc kubenswrapper[4778]: [+]poststarthook/openshift.io-startinformers ok
Sep 30 17:20:09 crc kubenswrapper[4778]: [+]poststarthook/openshift.io-restmapperupdater ok
Sep 30 17:20:09 crc kubenswrapper[4778]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Sep 30 17:20:09 crc kubenswrapper[4778]: livez check failed
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.363487 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-b72ms" podUID="e848ffb5-3244-4b27-a090-c63685145174" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.369376 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 17:20:09 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld
Sep 30 17:20:09 crc kubenswrapper[4778]: [+]process-running ok
Sep 30 17:20:09 crc kubenswrapper[4778]: healthz check failed
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.369496 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.378558 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-pl4fp" podStartSLOduration=127.378535705 podStartE2EDuration="2m7.378535705s" podCreationTimestamp="2025-09-30 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:09.377527106 +0000 UTC m=+148.367424909" watchObservedRunningTime="2025-09-30 17:20:09.378535705 +0000 UTC m=+148.368433508"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.403787 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9" podStartSLOduration=126.403754496 podStartE2EDuration="2m6.403754496s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:09.401793101 +0000 UTC m=+148.391690904" watchObservedRunningTime="2025-09-30 17:20:09.403754496 +0000 UTC m=+148.393652299"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.414047 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:09 crc kubenswrapper[4778]: E0930 17:20:09.414662 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:09.914643015 +0000 UTC m=+148.904540818 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.424982 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8" podStartSLOduration=126.424955202 podStartE2EDuration="2m6.424955202s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:09.422129863 +0000 UTC m=+148.412027666" watchObservedRunningTime="2025-09-30 17:20:09.424955202 +0000 UTC m=+148.414853005"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.466319 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" podStartSLOduration=126.466300304 podStartE2EDuration="2m6.466300304s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:09.447595644 +0000 UTC m=+148.437493447" watchObservedRunningTime="2025-09-30 17:20:09.466300304 +0000 UTC m=+148.456198107"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.508551 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-hm822" podStartSLOduration=127.508479517 podStartE2EDuration="2m7.508479517s" podCreationTimestamp="2025-09-30 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:09.507788331 +0000 UTC m=+148.497686144" watchObservedRunningTime="2025-09-30 17:20:09.508479517 +0000 UTC m=+148.498377320"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.510016 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rzzcv" podStartSLOduration=9.510008767 podStartE2EDuration="9.510008767s" podCreationTimestamp="2025-09-30 17:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:09.467700437 +0000 UTC m=+148.457598240" watchObservedRunningTime="2025-09-30 17:20:09.510008767 +0000 UTC m=+148.499906570"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.516232 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:09 crc kubenswrapper[4778]: E0930 17:20:09.516653 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:10.016632381 +0000 UTC m=+149.006530184 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.538540 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-whtc5" podStartSLOduration=126.538520134 podStartE2EDuration="2m6.538520134s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:09.53711779 +0000 UTC m=+148.527015593" watchObservedRunningTime="2025-09-30 17:20:09.538520134 +0000 UTC m=+148.528417937"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.605589 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nz4ww" podStartSLOduration=126.605559515 podStartE2EDuration="2m6.605559515s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:09.561371454 +0000 UTC m=+148.551269257" watchObservedRunningTime="2025-09-30 17:20:09.605559515 +0000 UTC m=+148.595457318"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.617770 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:09 crc kubenswrapper[4778]: E0930 17:20:09.618056 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:10.118011444 +0000 UTC m=+149.107909257 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.618420 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:09 crc kubenswrapper[4778]: E0930 17:20:09.618840 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:10.118825155 +0000 UTC m=+149.108722958 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.719602 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:09 crc kubenswrapper[4778]: E0930 17:20:09.719802 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:10.219763662 +0000 UTC m=+149.209661455 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.719947 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.720009 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.720040 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.720064 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.720089 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:20:09 crc kubenswrapper[4778]: E0930 17:20:09.720409 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:10.220401716 +0000 UTC m=+149.210299519 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.725636 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.728583 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.728679 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.744191 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.821341 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:09 crc kubenswrapper[4778]: E0930 17:20:09.821745 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:10.321588621 +0000 UTC m=+149.311486424 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.822255 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:09 crc kubenswrapper[4778]: E0930 17:20:09.822826 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:10.322818469 +0000 UTC m=+149.312716272 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.831043 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.924669 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:09 crc kubenswrapper[4778]: E0930 17:20:09.925126 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:10.425090806 +0000 UTC m=+149.414988609 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.925262 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:09 crc kubenswrapper[4778]: E0930 17:20:09.925828 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:10.425810983 +0000 UTC m=+149.415708786 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.947349 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 17:20:09 crc kubenswrapper[4778]: I0930 17:20:09.976296 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.026523 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:10 crc kubenswrapper[4778]: E0930 17:20:10.027119 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:10.527095863 +0000 UTC m=+149.516993666 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.131083 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:10 crc kubenswrapper[4778]: E0930 17:20:10.131685 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:10.631664689 +0000 UTC m=+149.621562502 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.178906 4778 patch_prober.go:28] interesting pod/console-operator-58897d9998-hm822 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body=
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.179555 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hm822" podUID="5f326088-a1d0-43ee-9b2a-41e7ac797679" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused"
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.178951 4778 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bjv8b container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.179809 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" podUID="385ca6a0-940d-409a-a0aa-b22ab8920177" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused"
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.206876 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fn7v9"
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.214993 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-sr9l8"
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.235567 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:10 crc kubenswrapper[4778]: E0930 17:20:10.236450 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:10.736424102 +0000 UTC m=+149.726321905 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.339425 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:10 crc kubenswrapper[4778]: E0930 17:20:10.353768 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:10.853744728 +0000 UTC m=+149.843642621 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.377175 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nfjrg" podStartSLOduration=127.377151279 podStartE2EDuration="2m7.377151279s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:09.606876856 +0000 UTC m=+148.596774649" watchObservedRunningTime="2025-09-30 17:20:10.377151279 +0000 UTC m=+149.367049082"
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.382234 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 17:20:10 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld
Sep 30 17:20:10 crc kubenswrapper[4778]: [+]process-running ok
Sep 30 17:20:10 crc kubenswrapper[4778]: healthz check failed
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.382310 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.444273 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:10 crc kubenswrapper[4778]: E0930 17:20:10.444577 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:10.944560474 +0000 UTC m=+149.934458267 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.549497 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:10 crc kubenswrapper[4778]: E0930 17:20:10.549904 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:11.049889139 +0000 UTC m=+150.039786942 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.653844 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:10 crc kubenswrapper[4778]: E0930 17:20:10.655014 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:11.154986026 +0000 UTC m=+150.144883829 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.760556 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:10 crc kubenswrapper[4778]: E0930 17:20:10.760967 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:11.260953795 +0000 UTC m=+150.250851598 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.861356 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:10 crc kubenswrapper[4778]: E0930 17:20:10.861739 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:11.361721604 +0000 UTC m=+150.351619407 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:10 crc kubenswrapper[4778]: I0930 17:20:10.963869 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:10 crc kubenswrapper[4778]: E0930 17:20:10.964207 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:11.464194029 +0000 UTC m=+150.454091832 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.064844 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:11 crc kubenswrapper[4778]: E0930 17:20:11.065726 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:11.565703017 +0000 UTC m=+150.555600810 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.167493 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:11 crc kubenswrapper[4778]: E0930 17:20:11.167981 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:11.667963304 +0000 UTC m=+150.657861117 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.212050 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"c22af6810114238145b5f90646b3d4c264d0004865decb48d7126fe5abc90be4"} Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.212135 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e41fdb75dc8d6fedf98a141a3098c18d48f40e13a5acce442124b8500005c3b4"} Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.222442 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"72a5372d4d88f2b2d3aa6a7117eaaa300ecdad22ecced9ebf1d994a71eb34069"} Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.230702 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"97b1b9ce6e2433111b93bf27626c399728091347e155987cd8aa666ed7d80e32"} Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.270606 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-td2mw" event={"ID":"6b13f665-8a05-4466-a03b-511d2456f1ca","Type":"ContainerStarted","Data":"279ab85df6b87e8d4d22be84bd1a77961ad5a4693659fcfeea9a192a5fea65a7"} Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.272219 4778 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-bjv8b 
container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.272277 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" podUID="385ca6a0-940d-409a-a0aa-b22ab8920177" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.26:8080/healthz\": dial tcp 10.217.0.26:8080: connect: connection refused" Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.272300 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:11 crc kubenswrapper[4778]: E0930 17:20:11.272812 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:11.772783089 +0000 UTC m=+150.762680892 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.371769 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:20:11 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Sep 30 17:20:11 crc kubenswrapper[4778]: [+]process-running ok Sep 30 17:20:11 crc kubenswrapper[4778]: healthz check failed Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.371840 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.374754 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:11 crc kubenswrapper[4778]: E0930 17:20:11.375580 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:11.875560746 +0000 UTC m=+150.865458549 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.476375 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:11 crc kubenswrapper[4778]: E0930 17:20:11.476642 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:11.976584925 +0000 UTC m=+150.966482758 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.477163 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:11 crc kubenswrapper[4778]: E0930 17:20:11.477518 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:11.97750619 +0000 UTC m=+150.967403983 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.530872 4778 patch_prober.go:28] interesting pod/console-operator-58897d9998-hm822 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Sep 30 17:20:11 crc kubenswrapper[4778]: [+]log ok Sep 30 17:20:11 crc kubenswrapper[4778]: [+]poststarthook/max-in-flight-filter ok Sep 30 17:20:11 crc kubenswrapper[4778]: [-]poststarthook/storage-object-count-tracker-hook failed: reason withheld Sep 30 17:20:11 crc kubenswrapper[4778]: [+]shutdown ok Sep 30 17:20:11 crc kubenswrapper[4778]: readyz check failed Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.530930 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-hm822" podUID="5f326088-a1d0-43ee-9b2a-41e7ac797679" containerName="console-operator" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.578190 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:11 crc kubenswrapper[4778]: E0930 17:20:11.578575 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:12.07855836 +0000 UTC m=+151.068456163 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.679411 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:11 crc kubenswrapper[4778]: E0930 17:20:11.679825 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:12.179807658 +0000 UTC m=+151.169705461 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.780119 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:11 crc kubenswrapper[4778]: E0930 17:20:11.780352 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:12.280318528 +0000 UTC m=+151.270216331 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.780409 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:11 crc kubenswrapper[4778]: E0930 17:20:11.780854 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 17:20:12.280837357 +0000 UTC m=+151.270735160 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-dtqsp" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.803813 4778 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.872955 4778 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-09-30T17:20:11.803842633Z","Handler":null,"Name":""} Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.881871 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 17:20:11 crc kubenswrapper[4778]: E0930 17:20:11.882257 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 17:20:12.382239852 +0000 UTC m=+151.372137655 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.897866 4778 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.897907 4778 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Sep 30 17:20:11 crc kubenswrapper[4778]: I0930 17:20:11.984124 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.010238 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bcbgk"] Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.013113 4778 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
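The entries above capture the retry loop resolving itself: the plugin watcher picks up the registration socket for kubevirt.io.hostpath-provisioner, the kubelet validates the driver over its CSI endpoint (csi_plugin.go:100/113), and the next MountDevice attempt is skipped because the node plugin does not advertise STAGE_UNSTAGE_VOLUME (csi_attacher.go:380). The sketch below is a minimal, hypothetical Go probe, not part of the kubelet, that performs the same two gRPC calls using the published CSI client bindings; it assumes it runs on the node with access to the socket path taken from the log.

// csiprobe.go: hypothetical probe of a CSI node plugin's identity and node
// capabilities over its UNIX socket, mirroring the two checks the log shows:
// driver validation, and the STAGE_UNSTAGE_VOLUME test before MountDevice.
package main

import (
	"context"
	"fmt"
	"log"
	"time"

	csi "github.com/container-storage-interface/spec/lib/go/csi"
	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
)

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
	defer cancel()

	// Dial the plugin's CSI endpoint (the path the kubelet logged).
	conn, err := grpc.DialContext(ctx, "unix:///var/lib/kubelet/plugins/csi-hostpath/csi.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatalf("dial: %v", err)
	}
	defer conn.Close()

	// Identity: until a driver with this name is registered, the volume
	// operations above fail with "driver name kubevirt.io.hostpath-provisioner
	// not found in the list of registered CSI drivers".
	id := csi.NewIdentityClient(conn)
	info, err := id.GetPluginInfo(ctx, &csi.GetPluginInfoRequest{})
	if err != nil {
		log.Fatalf("GetPluginInfo: %v", err)
	}
	fmt.Printf("driver %q version %q\n", info.GetName(), info.GetVendorVersion())

	// Node capabilities: when STAGE_UNSTAGE_VOLUME is absent, the kubelet
	// logs "STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...".
	node := csi.NewNodeClient(conn)
	caps, err := node.NodeGetCapabilities(ctx, &csi.NodeGetCapabilitiesRequest{})
	if err != nil {
		log.Fatalf("NodeGetCapabilities: %v", err)
	}
	staged := false
	for _, c := range caps.GetCapabilities() {
		if c.GetRpc().GetType() == csi.NodeServiceCapability_RPC_STAGE_UNSTAGE_VOLUME {
			staged = true
		}
	}
	fmt.Printf("STAGE_UNSTAGE_VOLUME advertised: %v\n", staged)
}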
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.013171 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.032939 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bcbgk"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.044047 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.050026 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bcbgk"]
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.171711 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wfx99"]
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.173195 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wfx99"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.175978 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.187303 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e159da09-2a9f-4472-acca-abe0193feb9f-catalog-content\") pod \"community-operators-bcbgk\" (UID: \"e159da09-2a9f-4472-acca-abe0193feb9f\") " pod="openshift-marketplace/community-operators-bcbgk"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.187350 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7wj5\" (UniqueName: \"kubernetes.io/projected/e159da09-2a9f-4472-acca-abe0193feb9f-kube-api-access-p7wj5\") pod \"community-operators-bcbgk\" (UID: \"e159da09-2a9f-4472-acca-abe0193feb9f\") " pod="openshift-marketplace/community-operators-bcbgk"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.187385 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e159da09-2a9f-4472-acca-abe0193feb9f-utilities\") pod \"community-operators-bcbgk\" (UID: \"e159da09-2a9f-4472-acca-abe0193feb9f\") " pod="openshift-marketplace/community-operators-bcbgk"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.204818 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wfx99"]
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.269001 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-dtqsp\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.288514 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.288861 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dgmf\" (UniqueName: \"kubernetes.io/projected/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-kube-api-access-5dgmf\") pod \"certified-operators-wfx99\" (UID: \"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b\") " pod="openshift-marketplace/certified-operators-wfx99"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.289327 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-utilities\") pod \"certified-operators-wfx99\" (UID: \"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b\") " pod="openshift-marketplace/certified-operators-wfx99"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.289393 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e159da09-2a9f-4472-acca-abe0193feb9f-catalog-content\") pod \"community-operators-bcbgk\" (UID: \"e159da09-2a9f-4472-acca-abe0193feb9f\") " pod="openshift-marketplace/community-operators-bcbgk"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.289422 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7wj5\" (UniqueName: \"kubernetes.io/projected/e159da09-2a9f-4472-acca-abe0193feb9f-kube-api-access-p7wj5\") pod \"community-operators-bcbgk\" (UID: \"e159da09-2a9f-4472-acca-abe0193feb9f\") " pod="openshift-marketplace/community-operators-bcbgk"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.289457 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-catalog-content\") pod \"certified-operators-wfx99\" (UID: \"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b\") " pod="openshift-marketplace/certified-operators-wfx99"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.289488 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e159da09-2a9f-4472-acca-abe0193feb9f-utilities\") pod \"community-operators-bcbgk\" (UID: \"e159da09-2a9f-4472-acca-abe0193feb9f\") " pod="openshift-marketplace/community-operators-bcbgk"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.290030 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e159da09-2a9f-4472-acca-abe0193feb9f-catalog-content\") pod \"community-operators-bcbgk\" (UID: \"e159da09-2a9f-4472-acca-abe0193feb9f\") " pod="openshift-marketplace/community-operators-bcbgk"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.290468 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e159da09-2a9f-4472-acca-abe0193feb9f-utilities\") pod \"community-operators-bcbgk\" (UID: \"e159da09-2a9f-4472-acca-abe0193feb9f\") " pod="openshift-marketplace/community-operators-bcbgk"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.294294 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-td2mw" event={"ID":"6b13f665-8a05-4466-a03b-511d2456f1ca","Type":"ContainerStarted","Data":"026c5df640be803a720b56d8669132f18515eff53d531cb03161eebf4aac0634"}
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.294342 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-td2mw" event={"ID":"6b13f665-8a05-4466-a03b-511d2456f1ca","Type":"ContainerStarted","Data":"dedb9130c0e6938cd3578201549669178d469492d303b62fcb3e21ee105c6d0a"}
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.298531 4778 generic.go:334] "Generic (PLEG): container finished" podID="535566fd-2a2e-43a4-94cb-dea8d1e2123b" containerID="b219f43da38348e2df202028bf2ab09060d1fa63e6dfc7a8fe4c44d947aa3369" exitCode=0
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.298611 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c" event={"ID":"535566fd-2a2e-43a4-94cb-dea8d1e2123b","Type":"ContainerDied","Data":"b219f43da38348e2df202028bf2ab09060d1fa63e6dfc7a8fe4c44d947aa3369"}
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.299921 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2009f51d67f9a7b2d5100c48890bc6b4c7104ac7fd33de45f493627dacc1449c"}
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.300188 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.301426 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"fa5444283f86fc70966769a5306e0519f0a3c10dafafe39c6243bffc01d5964b"}
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.319685 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7wj5\" (UniqueName: \"kubernetes.io/projected/e159da09-2a9f-4472-acca-abe0193feb9f-kube-api-access-p7wj5\") pod \"community-operators-bcbgk\" (UID: \"e159da09-2a9f-4472-acca-abe0193feb9f\") " pod="openshift-marketplace/community-operators-bcbgk"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.320236 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.333863 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-td2mw" podStartSLOduration=12.333839197 podStartE2EDuration="12.333839197s" podCreationTimestamp="2025-09-30 17:20:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:12.329932366 +0000 UTC m=+151.319830169" watchObservedRunningTime="2025-09-30 17:20:12.333839197 +0000 UTC m=+151.323737000"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.364338 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bcbgk"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.367220 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rhbnw"]
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.368411 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rhbnw"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.370209 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.375763 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 17:20:12 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld
Sep 30 17:20:12 crc kubenswrapper[4778]: [+]process-running ok
Sep 30 17:20:12 crc kubenswrapper[4778]: healthz check failed
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.375829 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.388017 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rhbnw"]
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.390426 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dgmf\" (UniqueName: \"kubernetes.io/projected/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-kube-api-access-5dgmf\") pod \"certified-operators-wfx99\" (UID: \"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b\") " pod="openshift-marketplace/certified-operators-wfx99"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.390518 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-utilities\") pod \"certified-operators-wfx99\" (UID: \"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b\") " pod="openshift-marketplace/certified-operators-wfx99"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.390551 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-catalog-content\") pod \"certified-operators-wfx99\" (UID: \"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b\") " pod="openshift-marketplace/certified-operators-wfx99"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.390967 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-catalog-content\") pod \"certified-operators-wfx99\" (UID: \"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b\") " pod="openshift-marketplace/certified-operators-wfx99"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.392809 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-utilities\") pod \"certified-operators-wfx99\" (UID: \"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b\") " pod="openshift-marketplace/certified-operators-wfx99"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.422000 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dgmf\" (UniqueName: \"kubernetes.io/projected/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-kube-api-access-5dgmf\") pod \"certified-operators-wfx99\" (UID: \"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b\") " pod="openshift-marketplace/certified-operators-wfx99"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.441839 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.442636 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.447139 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.447314 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.459221 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.488864 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wfx99"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.491512 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-utilities\") pod \"community-operators-rhbnw\" (UID: \"e90450b5-cbd1-44fd-9fd6-7def6ea75b33\") " pod="openshift-marketplace/community-operators-rhbnw"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.491584 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w9zg\" (UniqueName: \"kubernetes.io/projected/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-kube-api-access-7w9zg\") pod \"community-operators-rhbnw\" (UID: \"e90450b5-cbd1-44fd-9fd6-7def6ea75b33\") " pod="openshift-marketplace/community-operators-rhbnw"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.491683 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-catalog-content\") pod \"community-operators-rhbnw\" (UID: \"e90450b5-cbd1-44fd-9fd6-7def6ea75b33\") " pod="openshift-marketplace/community-operators-rhbnw"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.573507 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lbvnt"]
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.574554 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lbvnt"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.597020 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-catalog-content\") pod \"community-operators-rhbnw\" (UID: \"e90450b5-cbd1-44fd-9fd6-7def6ea75b33\") " pod="openshift-marketplace/community-operators-rhbnw"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.597403 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lbvnt"]
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.597450 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-utilities\") pod \"community-operators-rhbnw\" (UID: \"e90450b5-cbd1-44fd-9fd6-7def6ea75b33\") " pod="openshift-marketplace/community-operators-rhbnw"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.597519 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.597557 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w9zg\" (UniqueName: \"kubernetes.io/projected/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-kube-api-access-7w9zg\") pod \"community-operators-rhbnw\" (UID: \"e90450b5-cbd1-44fd-9fd6-7def6ea75b33\") " pod="openshift-marketplace/community-operators-rhbnw"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.597603 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.597970 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-utilities\") pod \"community-operators-rhbnw\" (UID: \"e90450b5-cbd1-44fd-9fd6-7def6ea75b33\") " pod="openshift-marketplace/community-operators-rhbnw"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.598436 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-catalog-content\") pod \"community-operators-rhbnw\" (UID: \"e90450b5-cbd1-44fd-9fd6-7def6ea75b33\") " pod="openshift-marketplace/community-operators-rhbnw"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.635751 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w9zg\" (UniqueName: \"kubernetes.io/projected/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-kube-api-access-7w9zg\") pod \"community-operators-rhbnw\" (UID: \"e90450b5-cbd1-44fd-9fd6-7def6ea75b33\") " pod="openshift-marketplace/community-operators-rhbnw"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.683287 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rhbnw"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.699058 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k28cx\" (UniqueName: \"kubernetes.io/projected/2654d449-8673-41a9-b2a9-c5f986819740-kube-api-access-k28cx\") pod \"certified-operators-lbvnt\" (UID: \"2654d449-8673-41a9-b2a9-c5f986819740\") " pod="openshift-marketplace/certified-operators-lbvnt"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.699139 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.699199 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2654d449-8673-41a9-b2a9-c5f986819740-catalog-content\") pod \"certified-operators-lbvnt\" (UID: \"2654d449-8673-41a9-b2a9-c5f986819740\") " pod="openshift-marketplace/certified-operators-lbvnt"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.699228 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.699283 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.699544 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2654d449-8673-41a9-b2a9-c5f986819740-utilities\") pod \"certified-operators-lbvnt\" (UID: \"2654d449-8673-41a9-b2a9-c5f986819740\") " pod="openshift-marketplace/certified-operators-lbvnt"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.720400 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.779690 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.803848 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k28cx\" (UniqueName: \"kubernetes.io/projected/2654d449-8673-41a9-b2a9-c5f986819740-kube-api-access-k28cx\") pod \"certified-operators-lbvnt\" (UID: \"2654d449-8673-41a9-b2a9-c5f986819740\") " pod="openshift-marketplace/certified-operators-lbvnt"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.803993 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2654d449-8673-41a9-b2a9-c5f986819740-catalog-content\") pod \"certified-operators-lbvnt\" (UID: \"2654d449-8673-41a9-b2a9-c5f986819740\") " pod="openshift-marketplace/certified-operators-lbvnt"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.804032 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2654d449-8673-41a9-b2a9-c5f986819740-utilities\") pod \"certified-operators-lbvnt\" (UID: \"2654d449-8673-41a9-b2a9-c5f986819740\") " pod="openshift-marketplace/certified-operators-lbvnt"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.805253 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2654d449-8673-41a9-b2a9-c5f986819740-utilities\") pod \"certified-operators-lbvnt\" (UID: \"2654d449-8673-41a9-b2a9-c5f986819740\") " pod="openshift-marketplace/certified-operators-lbvnt"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.807149 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2654d449-8673-41a9-b2a9-c5f986819740-catalog-content\") pod \"certified-operators-lbvnt\" (UID: \"2654d449-8673-41a9-b2a9-c5f986819740\") " pod="openshift-marketplace/certified-operators-lbvnt"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.840944 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k28cx\" (UniqueName: \"kubernetes.io/projected/2654d449-8673-41a9-b2a9-c5f986819740-kube-api-access-k28cx\") pod \"certified-operators-lbvnt\" (UID: \"2654d449-8673-41a9-b2a9-c5f986819740\") " pod="openshift-marketplace/certified-operators-lbvnt"
Sep 30 17:20:12 crc kubenswrapper[4778]: I0930 17:20:12.922604 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lbvnt"
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.081963 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wfx99"]
Sep 30 17:20:13 crc kubenswrapper[4778]: W0930 17:20:13.104387 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fabd2e3_f24a_411f_a8a6_ce455ddd6d9b.slice/crio-ff5e0e93ad3eac76d9b1585771c1a254b05cc29027001a9a3187dd82f0af6ff7 WatchSource:0}: Error finding container ff5e0e93ad3eac76d9b1585771c1a254b05cc29027001a9a3187dd82f0af6ff7: Status 404 returned error can't find the container with id ff5e0e93ad3eac76d9b1585771c1a254b05cc29027001a9a3187dd82f0af6ff7
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.162379 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rhbnw"]
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.174697 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bcbgk"]
Sep 30 17:20:13 crc kubenswrapper[4778]: W0930 17:20:13.177531 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode90450b5_cbd1_44fd_9fd6_7def6ea75b33.slice/crio-4fe2395e32335a557e1b789dedc4ed922ff912fd81500b39ae594dc0e5f13647 WatchSource:0}: Error finding container 4fe2395e32335a557e1b789dedc4ed922ff912fd81500b39ae594dc0e5f13647: Status 404 returned error can't find the container with id 4fe2395e32335a557e1b789dedc4ed922ff912fd81500b39ae594dc0e5f13647
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.178358 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dtqsp"]
Sep 30 17:20:13 crc kubenswrapper[4778]: W0930 17:20:13.190608 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode159da09_2a9f_4472_acca_abe0193feb9f.slice/crio-9cb1372030befbca7b84f8ec04c93d6501310fe27f7437275c985ab7013b43f9 WatchSource:0}: Error finding container 9cb1372030befbca7b84f8ec04c93d6501310fe27f7437275c985ab7013b43f9: Status 404 returned error can't find the container with id 9cb1372030befbca7b84f8ec04c93d6501310fe27f7437275c985ab7013b43f9
Sep 30 17:20:13 crc kubenswrapper[4778]: W0930 17:20:13.198834 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod452a4879_b0bd_490e_bffb_25f8404a6eac.slice/crio-4039dddd8e26351159f4a9395e62f5cbecc91b6b066c5e638f58ce408c37e5dc WatchSource:0}: Error finding container 4039dddd8e26351159f4a9395e62f5cbecc91b6b066c5e638f58ce408c37e5dc: Status 404 returned error can't find the container with id 4039dddd8e26351159f4a9395e62f5cbecc91b6b066c5e638f58ce408c37e5dc
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.254590 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.316233 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wfx99" event={"ID":"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b","Type":"ContainerStarted","Data":"ff5e0e93ad3eac76d9b1585771c1a254b05cc29027001a9a3187dd82f0af6ff7"}
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.318091 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c","Type":"ContainerStarted","Data":"5246a3e014f554dad48236b613bfa230c6ef40b0d6eb0ebf921338e2627a2caa"}
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.330481 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcbgk" event={"ID":"e159da09-2a9f-4472-acca-abe0193feb9f","Type":"ContainerStarted","Data":"9cb1372030befbca7b84f8ec04c93d6501310fe27f7437275c985ab7013b43f9"}
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.331208 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lbvnt"]
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.332048 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" event={"ID":"452a4879-b0bd-490e-bffb-25f8404a6eac","Type":"ContainerStarted","Data":"4039dddd8e26351159f4a9395e62f5cbecc91b6b066c5e638f58ce408c37e5dc"}
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.333867 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhbnw" event={"ID":"e90450b5-cbd1-44fd-9fd6-7def6ea75b33","Type":"ContainerStarted","Data":"4fe2395e32335a557e1b789dedc4ed922ff912fd81500b39ae594dc0e5f13647"}
Sep 30 17:20:13 crc kubenswrapper[4778]: W0930 17:20:13.342264 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2654d449_8673_41a9_b2a9_c5f986819740.slice/crio-81f89580731a50a28f41f8c3a9c69b99252f22d109f20cab6bc4db628a369050 WatchSource:0}: Error finding container 81f89580731a50a28f41f8c3a9c69b99252f22d109f20cab6bc4db628a369050: Status 404 returned error can't find the container with id 81f89580731a50a28f41f8c3a9c69b99252f22d109f20cab6bc4db628a369050
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.357935 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-svds2"
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.362888 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 17:20:13 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld
Sep 30 17:20:13 crc kubenswrapper[4778]: [+]process-running ok
Sep 30 17:20:13 crc kubenswrapper[4778]: healthz check failed
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.362963 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.544252 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c"
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.616706 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/535566fd-2a2e-43a4-94cb-dea8d1e2123b-secret-volume\") pod \"535566fd-2a2e-43a4-94cb-dea8d1e2123b\" (UID: \"535566fd-2a2e-43a4-94cb-dea8d1e2123b\") "
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.616767 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tkmt\" (UniqueName: \"kubernetes.io/projected/535566fd-2a2e-43a4-94cb-dea8d1e2123b-kube-api-access-7tkmt\") pod \"535566fd-2a2e-43a4-94cb-dea8d1e2123b\" (UID: \"535566fd-2a2e-43a4-94cb-dea8d1e2123b\") "
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.616815 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/535566fd-2a2e-43a4-94cb-dea8d1e2123b-config-volume\") pod \"535566fd-2a2e-43a4-94cb-dea8d1e2123b\" (UID: \"535566fd-2a2e-43a4-94cb-dea8d1e2123b\") "
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.619326 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-b72ms"
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.619444 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/535566fd-2a2e-43a4-94cb-dea8d1e2123b-config-volume" (OuterVolumeSpecName: "config-volume") pod "535566fd-2a2e-43a4-94cb-dea8d1e2123b" (UID: "535566fd-2a2e-43a4-94cb-dea8d1e2123b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.626678 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/535566fd-2a2e-43a4-94cb-dea8d1e2123b-kube-api-access-7tkmt" (OuterVolumeSpecName: "kube-api-access-7tkmt") pod "535566fd-2a2e-43a4-94cb-dea8d1e2123b" (UID: "535566fd-2a2e-43a4-94cb-dea8d1e2123b"). InnerVolumeSpecName "kube-api-access-7tkmt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.626778 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/535566fd-2a2e-43a4-94cb-dea8d1e2123b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "535566fd-2a2e-43a4-94cb-dea8d1e2123b" (UID: "535566fd-2a2e-43a4-94cb-dea8d1e2123b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.628299 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-b72ms"
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.683837 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Sep 30 17:20:13 crc kubenswrapper[4778]: E0930 17:20:13.684148 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="535566fd-2a2e-43a4-94cb-dea8d1e2123b" containerName="collect-profiles"
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.684169 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="535566fd-2a2e-43a4-94cb-dea8d1e2123b" containerName="collect-profiles"
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.684287 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="535566fd-2a2e-43a4-94cb-dea8d1e2123b" containerName="collect-profiles"
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.684812 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.688064 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.701822 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.711016 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.718287 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/535566fd-2a2e-43a4-94cb-dea8d1e2123b-secret-volume\") on node \"crc\" DevicePath \"\""
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.718310 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tkmt\" (UniqueName: \"kubernetes.io/projected/535566fd-2a2e-43a4-94cb-dea8d1e2123b-kube-api-access-7tkmt\") on node \"crc\" DevicePath \"\""
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.718319 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/535566fd-2a2e-43a4-94cb-dea8d1e2123b-config-volume\") on node \"crc\" DevicePath \"\""
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.721085 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.819364 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13141512-8786-4c16-8cf6-b836d2b3158f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"13141512-8786-4c16-8cf6-b836d2b3158f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.819465 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13141512-8786-4c16-8cf6-b836d2b3158f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID:
\"13141512-8786-4c16-8cf6-b836d2b3158f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.921193 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13141512-8786-4c16-8cf6-b836d2b3158f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"13141512-8786-4c16-8cf6-b836d2b3158f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.921289 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13141512-8786-4c16-8cf6-b836d2b3158f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"13141512-8786-4c16-8cf6-b836d2b3158f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.921361 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13141512-8786-4c16-8cf6-b836d2b3158f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"13141512-8786-4c16-8cf6-b836d2b3158f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:20:13 crc kubenswrapper[4778]: I0930 17:20:13.942135 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13141512-8786-4c16-8cf6-b836d2b3158f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"13141512-8786-4c16-8cf6-b836d2b3158f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.002362 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.065328 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.164230 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-xr85z container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.164665 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-xr85z" podUID="22eb8492-05be-4e05-a4b8-34f965c014ed" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.164541 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-xr85z container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.164831 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xr85z" podUID="22eb8492-05be-4e05-a4b8-34f965c014ed" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.188788 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-ls2gx"] Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.197734 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ls2gx" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.203444 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ls2gx"] Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.211144 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.313893 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.321289 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-26ztf" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.328435 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-catalog-content\") pod \"redhat-marketplace-ls2gx\" (UID: \"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122\") " pod="openshift-marketplace/redhat-marketplace-ls2gx" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.328583 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dz4c\" (UniqueName: \"kubernetes.io/projected/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-kube-api-access-8dz4c\") pod \"redhat-marketplace-ls2gx\" (UID: \"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122\") " pod="openshift-marketplace/redhat-marketplace-ls2gx" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.328621 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-utilities\") pod \"redhat-marketplace-ls2gx\" (UID: \"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122\") " pod="openshift-marketplace/redhat-marketplace-ls2gx" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.343243 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-klvfk" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.373199 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:20:14 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Sep 30 17:20:14 crc kubenswrapper[4778]: [+]process-running ok Sep 30 17:20:14 crc kubenswrapper[4778]: healthz check failed Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.373266 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.380311 4778 generic.go:334] "Generic (PLEG): container finished" podID="e159da09-2a9f-4472-acca-abe0193feb9f" 
containerID="0fae513d24c93a52b45513974ac4d6e4b28f9f725c72440dc48b20efe27bc1f0" exitCode=0 Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.380413 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcbgk" event={"ID":"e159da09-2a9f-4472-acca-abe0193feb9f","Type":"ContainerDied","Data":"0fae513d24c93a52b45513974ac4d6e4b28f9f725c72440dc48b20efe27bc1f0"} Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.384100 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.386381 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.399974 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" event={"ID":"452a4879-b0bd-490e-bffb-25f8404a6eac","Type":"ContainerStarted","Data":"999172d57c203d9f1102512f66f8a88c1d21a4cb0ed4beb85999edcc16149741"} Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.425375 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.425675 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c" event={"ID":"535566fd-2a2e-43a4-94cb-dea8d1e2123b","Type":"ContainerDied","Data":"be420535acc7a937d1dc95b340608a503777d0c1a923edd7ff566f98511e6db1"} Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.425703 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be420535acc7a937d1dc95b340608a503777d0c1a923edd7ff566f98511e6db1" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.433930 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-utilities\") pod \"redhat-marketplace-ls2gx\" (UID: \"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122\") " pod="openshift-marketplace/redhat-marketplace-ls2gx" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.434370 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-catalog-content\") pod \"redhat-marketplace-ls2gx\" (UID: \"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122\") " pod="openshift-marketplace/redhat-marketplace-ls2gx" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.434548 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dz4c\" (UniqueName: \"kubernetes.io/projected/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-kube-api-access-8dz4c\") pod \"redhat-marketplace-ls2gx\" (UID: \"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122\") " pod="openshift-marketplace/redhat-marketplace-ls2gx" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.434867 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-utilities\") pod \"redhat-marketplace-ls2gx\" (UID: \"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122\") " pod="openshift-marketplace/redhat-marketplace-ls2gx" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.437394 4778 generic.go:334] "Generic (PLEG): 
container finished" podID="2654d449-8673-41a9-b2a9-c5f986819740" containerID="8a7c82884b28f147a8c5d3048ff2112d1d84094812b70176ff306f164b2752ae" exitCode=0 Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.437548 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lbvnt" event={"ID":"2654d449-8673-41a9-b2a9-c5f986819740","Type":"ContainerDied","Data":"8a7c82884b28f147a8c5d3048ff2112d1d84094812b70176ff306f164b2752ae"} Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.437582 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lbvnt" event={"ID":"2654d449-8673-41a9-b2a9-c5f986819740","Type":"ContainerStarted","Data":"81f89580731a50a28f41f8c3a9c69b99252f22d109f20cab6bc4db628a369050"} Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.445412 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-catalog-content\") pod \"redhat-marketplace-ls2gx\" (UID: \"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122\") " pod="openshift-marketplace/redhat-marketplace-ls2gx" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.453297 4778 generic.go:334] "Generic (PLEG): container finished" podID="e90450b5-cbd1-44fd-9fd6-7def6ea75b33" containerID="0cc17a6330ca0e1ec7eab55698ac3c9deb9e5b46cade675946a70ee51a5654c8" exitCode=0 Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.453430 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhbnw" event={"ID":"e90450b5-cbd1-44fd-9fd6-7def6ea75b33","Type":"ContainerDied","Data":"0cc17a6330ca0e1ec7eab55698ac3c9deb9e5b46cade675946a70ee51a5654c8"} Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.477995 4778 generic.go:334] "Generic (PLEG): container finished" podID="7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b" containerID="87c6c540df34133a8f6eea3f490f991b287f6a7da12c023e4615acb0eebddffd" exitCode=0 Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.478145 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wfx99" event={"ID":"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b","Type":"ContainerDied","Data":"87c6c540df34133a8f6eea3f490f991b287f6a7da12c023e4615acb0eebddffd"} Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.484336 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.502573 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dz4c\" (UniqueName: \"kubernetes.io/projected/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-kube-api-access-8dz4c\") pod \"redhat-marketplace-ls2gx\" (UID: \"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122\") " pod="openshift-marketplace/redhat-marketplace-ls2gx" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.517659 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c","Type":"ContainerStarted","Data":"03e9552f583c1fb04b67d34699a0de48921313ef78302ce90679b239bc33e411"} Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.520388 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ls2gx" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.595216 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-42qm2"] Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.596409 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42qm2" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.622500 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-42qm2"] Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.747323 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78dtx\" (UniqueName: \"kubernetes.io/projected/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-kube-api-access-78dtx\") pod \"redhat-marketplace-42qm2\" (UID: \"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e\") " pod="openshift-marketplace/redhat-marketplace-42qm2" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.747893 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-catalog-content\") pod \"redhat-marketplace-42qm2\" (UID: \"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e\") " pod="openshift-marketplace/redhat-marketplace-42qm2" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.747927 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-utilities\") pod \"redhat-marketplace-42qm2\" (UID: \"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e\") " pod="openshift-marketplace/redhat-marketplace-42qm2" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.812360 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.812428 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.850108 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-catalog-content\") pod \"redhat-marketplace-42qm2\" (UID: \"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e\") " pod="openshift-marketplace/redhat-marketplace-42qm2" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.850207 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-utilities\") pod \"redhat-marketplace-42qm2\" (UID: \"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e\") " pod="openshift-marketplace/redhat-marketplace-42qm2" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.850246 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78dtx\" 
(UniqueName: \"kubernetes.io/projected/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-kube-api-access-78dtx\") pod \"redhat-marketplace-42qm2\" (UID: \"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e\") " pod="openshift-marketplace/redhat-marketplace-42qm2" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.851135 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-catalog-content\") pod \"redhat-marketplace-42qm2\" (UID: \"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e\") " pod="openshift-marketplace/redhat-marketplace-42qm2" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.851447 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-utilities\") pod \"redhat-marketplace-42qm2\" (UID: \"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e\") " pod="openshift-marketplace/redhat-marketplace-42qm2" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.876750 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78dtx\" (UniqueName: \"kubernetes.io/projected/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-kube-api-access-78dtx\") pod \"redhat-marketplace-42qm2\" (UID: \"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e\") " pod="openshift-marketplace/redhat-marketplace-42qm2" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.921764 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.923124 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.939837 4778 patch_prober.go:28] interesting pod/console-f9d7485db-pl4fp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.939916 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-pl4fp" podUID="82aaff94-0c3c-4a1b-be0c-5371c3b60ab0" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.953206 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42qm2" Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.967596 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ls2gx"] Sep 30 17:20:14 crc kubenswrapper[4778]: I0930 17:20:14.995858 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-hm822" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.179375 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hpfsj"] Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.183921 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hpfsj" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.190446 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.198839 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hpfsj"] Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.263570 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-utilities\") pod \"redhat-operators-hpfsj\" (UID: \"bb93f0e3-e3b6-41bc-af18-91e3107fc79a\") " pod="openshift-marketplace/redhat-operators-hpfsj" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.263662 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r6p5\" (UniqueName: \"kubernetes.io/projected/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-kube-api-access-9r6p5\") pod \"redhat-operators-hpfsj\" (UID: \"bb93f0e3-e3b6-41bc-af18-91e3107fc79a\") " pod="openshift-marketplace/redhat-operators-hpfsj" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.263747 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-catalog-content\") pod \"redhat-operators-hpfsj\" (UID: \"bb93f0e3-e3b6-41bc-af18-91e3107fc79a\") " pod="openshift-marketplace/redhat-operators-hpfsj" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.364569 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-utilities\") pod \"redhat-operators-hpfsj\" (UID: \"bb93f0e3-e3b6-41bc-af18-91e3107fc79a\") " pod="openshift-marketplace/redhat-operators-hpfsj" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.365069 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r6p5\" (UniqueName: \"kubernetes.io/projected/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-kube-api-access-9r6p5\") pod \"redhat-operators-hpfsj\" (UID: \"bb93f0e3-e3b6-41bc-af18-91e3107fc79a\") " pod="openshift-marketplace/redhat-operators-hpfsj" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.365111 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-catalog-content\") pod \"redhat-operators-hpfsj\" (UID: \"bb93f0e3-e3b6-41bc-af18-91e3107fc79a\") " pod="openshift-marketplace/redhat-operators-hpfsj" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.365676 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-catalog-content\") pod \"redhat-operators-hpfsj\" (UID: \"bb93f0e3-e3b6-41bc-af18-91e3107fc79a\") " pod="openshift-marketplace/redhat-operators-hpfsj" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.364865 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:20:15 crc 
kubenswrapper[4778]: [-]has-synced failed: reason withheld Sep 30 17:20:15 crc kubenswrapper[4778]: [+]process-running ok Sep 30 17:20:15 crc kubenswrapper[4778]: healthz check failed Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.365729 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-utilities\") pod \"redhat-operators-hpfsj\" (UID: \"bb93f0e3-e3b6-41bc-af18-91e3107fc79a\") " pod="openshift-marketplace/redhat-operators-hpfsj" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.365746 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.431167 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r6p5\" (UniqueName: \"kubernetes.io/projected/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-kube-api-access-9r6p5\") pod \"redhat-operators-hpfsj\" (UID: \"bb93f0e3-e3b6-41bc-af18-91e3107fc79a\") " pod="openshift-marketplace/redhat-operators-hpfsj" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.512190 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-42qm2"] Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.512544 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hpfsj" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.533716 4778 generic.go:334] "Generic (PLEG): container finished" podID="7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c" containerID="03e9552f583c1fb04b67d34699a0de48921313ef78302ce90679b239bc33e411" exitCode=0 Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.534219 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c","Type":"ContainerDied","Data":"03e9552f583c1fb04b67d34699a0de48921313ef78302ce90679b239bc33e411"} Sep 30 17:20:15 crc kubenswrapper[4778]: W0930 17:20:15.548129 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d747b8a_a90d_4f5b_8431_cbe604c9ee7e.slice/crio-10bba515450df2c9ca5de7a2a90580f7c2e6189af202026b2581891e16247cb1 WatchSource:0}: Error finding container 10bba515450df2c9ca5de7a2a90580f7c2e6189af202026b2581891e16247cb1: Status 404 returned error can't find the container with id 10bba515450df2c9ca5de7a2a90580f7c2e6189af202026b2581891e16247cb1 Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.550371 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"13141512-8786-4c16-8cf6-b836d2b3158f","Type":"ContainerStarted","Data":"998acbdd428d6609dda1b979a8a1b4435812d3c33115e69f1b741865d5db36ee"} Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.550419 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"13141512-8786-4c16-8cf6-b836d2b3158f","Type":"ContainerStarted","Data":"b7d4c5b7a0f262fe267c9363930ef19f226551746ddd3db38afe0ca9fb7894c5"} Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.607426 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-ls2gx" event={"ID":"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122","Type":"ContainerStarted","Data":"2a929fef47071100a30f4360570e2db169819f6fe3bc44e2effc487d7bc9df14"} Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.607500 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls2gx" event={"ID":"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122","Type":"ContainerStarted","Data":"8e784aba9551465db2204ff462b724cb5363f99e414cf3b019f3d300be025772"} Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.608570 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.618671 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vqhkf"] Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.624816 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vqhkf" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.630630 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vqhkf"] Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.638839 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.6388021 podStartE2EDuration="2.6388021s" podCreationTimestamp="2025-09-30 17:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:15.588512175 +0000 UTC m=+154.578409978" watchObservedRunningTime="2025-09-30 17:20:15.6388021 +0000 UTC m=+154.628699903" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.645788 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" podStartSLOduration=132.645754128 podStartE2EDuration="2m12.645754128s" podCreationTimestamp="2025-09-30 17:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:15.639458406 +0000 UTC m=+154.629356219" watchObservedRunningTime="2025-09-30 17:20:15.645754128 +0000 UTC m=+154.635651931" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.774730 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g54wt\" (UniqueName: \"kubernetes.io/projected/7a3d4534-9878-41a4-b950-5e671d5e5cdb-kube-api-access-g54wt\") pod \"redhat-operators-vqhkf\" (UID: \"7a3d4534-9878-41a4-b950-5e671d5e5cdb\") " pod="openshift-marketplace/redhat-operators-vqhkf" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.775280 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3d4534-9878-41a4-b950-5e671d5e5cdb-catalog-content\") pod \"redhat-operators-vqhkf\" (UID: \"7a3d4534-9878-41a4-b950-5e671d5e5cdb\") " pod="openshift-marketplace/redhat-operators-vqhkf" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.775360 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3d4534-9878-41a4-b950-5e671d5e5cdb-utilities\") pod \"redhat-operators-vqhkf\" (UID: 
\"7a3d4534-9878-41a4-b950-5e671d5e5cdb\") " pod="openshift-marketplace/redhat-operators-vqhkf" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.877534 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3d4534-9878-41a4-b950-5e671d5e5cdb-catalog-content\") pod \"redhat-operators-vqhkf\" (UID: \"7a3d4534-9878-41a4-b950-5e671d5e5cdb\") " pod="openshift-marketplace/redhat-operators-vqhkf" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.877603 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3d4534-9878-41a4-b950-5e671d5e5cdb-utilities\") pod \"redhat-operators-vqhkf\" (UID: \"7a3d4534-9878-41a4-b950-5e671d5e5cdb\") " pod="openshift-marketplace/redhat-operators-vqhkf" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.877646 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g54wt\" (UniqueName: \"kubernetes.io/projected/7a3d4534-9878-41a4-b950-5e671d5e5cdb-kube-api-access-g54wt\") pod \"redhat-operators-vqhkf\" (UID: \"7a3d4534-9878-41a4-b950-5e671d5e5cdb\") " pod="openshift-marketplace/redhat-operators-vqhkf" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.878355 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3d4534-9878-41a4-b950-5e671d5e5cdb-catalog-content\") pod \"redhat-operators-vqhkf\" (UID: \"7a3d4534-9878-41a4-b950-5e671d5e5cdb\") " pod="openshift-marketplace/redhat-operators-vqhkf" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.878695 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3d4534-9878-41a4-b950-5e671d5e5cdb-utilities\") pod \"redhat-operators-vqhkf\" (UID: \"7a3d4534-9878-41a4-b950-5e671d5e5cdb\") " pod="openshift-marketplace/redhat-operators-vqhkf" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.892661 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hpfsj"] Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.923917 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g54wt\" (UniqueName: \"kubernetes.io/projected/7a3d4534-9878-41a4-b950-5e671d5e5cdb-kube-api-access-g54wt\") pod \"redhat-operators-vqhkf\" (UID: \"7a3d4534-9878-41a4-b950-5e671d5e5cdb\") " pod="openshift-marketplace/redhat-operators-vqhkf" Sep 30 17:20:15 crc kubenswrapper[4778]: I0930 17:20:15.978495 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vqhkf" Sep 30 17:20:16 crc kubenswrapper[4778]: I0930 17:20:16.270261 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vqhkf"] Sep 30 17:20:16 crc kubenswrapper[4778]: W0930 17:20:16.321452 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a3d4534_9878_41a4_b950_5e671d5e5cdb.slice/crio-347ad9bcad8908a6e77e318b1e8ce381687fb31a3b47a5854fa5212946c81923 WatchSource:0}: Error finding container 347ad9bcad8908a6e77e318b1e8ce381687fb31a3b47a5854fa5212946c81923: Status 404 returned error can't find the container with id 347ad9bcad8908a6e77e318b1e8ce381687fb31a3b47a5854fa5212946c81923 Sep 30 17:20:16 crc kubenswrapper[4778]: I0930 17:20:16.363963 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:20:16 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Sep 30 17:20:16 crc kubenswrapper[4778]: [+]process-running ok Sep 30 17:20:16 crc kubenswrapper[4778]: healthz check failed Sep 30 17:20:16 crc kubenswrapper[4778]: I0930 17:20:16.364050 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:20:16 crc kubenswrapper[4778]: I0930 17:20:16.622526 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqhkf" event={"ID":"7a3d4534-9878-41a4-b950-5e671d5e5cdb","Type":"ContainerStarted","Data":"347ad9bcad8908a6e77e318b1e8ce381687fb31a3b47a5854fa5212946c81923"} Sep 30 17:20:16 crc kubenswrapper[4778]: I0930 17:20:16.625596 4778 generic.go:334] "Generic (PLEG): container finished" podID="bb93f0e3-e3b6-41bc-af18-91e3107fc79a" containerID="fde1cfb85200da591799a72596e32e816c99c594f22786f7641254e071af1ebe" exitCode=0 Sep 30 17:20:16 crc kubenswrapper[4778]: I0930 17:20:16.625726 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpfsj" event={"ID":"bb93f0e3-e3b6-41bc-af18-91e3107fc79a","Type":"ContainerDied","Data":"fde1cfb85200da591799a72596e32e816c99c594f22786f7641254e071af1ebe"} Sep 30 17:20:16 crc kubenswrapper[4778]: I0930 17:20:16.625785 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpfsj" event={"ID":"bb93f0e3-e3b6-41bc-af18-91e3107fc79a","Type":"ContainerStarted","Data":"4f4a68faff9d0a98c2db27d0a1a9ad05c99829285659266896535e951782bbde"} Sep 30 17:20:16 crc kubenswrapper[4778]: I0930 17:20:16.643464 4778 generic.go:334] "Generic (PLEG): container finished" podID="7d747b8a-a90d-4f5b-8431-cbe604c9ee7e" containerID="5b1de6f4f8bfc0f78160f49d1565557b0a8af72f43961718687018879211f433" exitCode=0 Sep 30 17:20:16 crc kubenswrapper[4778]: I0930 17:20:16.643573 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42qm2" event={"ID":"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e","Type":"ContainerDied","Data":"5b1de6f4f8bfc0f78160f49d1565557b0a8af72f43961718687018879211f433"} Sep 30 17:20:16 crc kubenswrapper[4778]: I0930 17:20:16.643704 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-42qm2" event={"ID":"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e","Type":"ContainerStarted","Data":"10bba515450df2c9ca5de7a2a90580f7c2e6189af202026b2581891e16247cb1"} Sep 30 17:20:16 crc kubenswrapper[4778]: I0930 17:20:16.653235 4778 generic.go:334] "Generic (PLEG): container finished" podID="13141512-8786-4c16-8cf6-b836d2b3158f" containerID="998acbdd428d6609dda1b979a8a1b4435812d3c33115e69f1b741865d5db36ee" exitCode=0 Sep 30 17:20:16 crc kubenswrapper[4778]: I0930 17:20:16.653489 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"13141512-8786-4c16-8cf6-b836d2b3158f","Type":"ContainerDied","Data":"998acbdd428d6609dda1b979a8a1b4435812d3c33115e69f1b741865d5db36ee"} Sep 30 17:20:16 crc kubenswrapper[4778]: I0930 17:20:16.660072 4778 generic.go:334] "Generic (PLEG): container finished" podID="dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122" containerID="2a929fef47071100a30f4360570e2db169819f6fe3bc44e2effc487d7bc9df14" exitCode=0 Sep 30 17:20:16 crc kubenswrapper[4778]: I0930 17:20:16.660404 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls2gx" event={"ID":"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122","Type":"ContainerDied","Data":"2a929fef47071100a30f4360570e2db169819f6fe3bc44e2effc487d7bc9df14"} Sep 30 17:20:16 crc kubenswrapper[4778]: I0930 17:20:16.993872 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:20:17 crc kubenswrapper[4778]: I0930 17:20:17.104365 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c-kube-api-access\") pod \"7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c\" (UID: \"7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c\") " Sep 30 17:20:17 crc kubenswrapper[4778]: I0930 17:20:17.105520 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c-kubelet-dir\") pod \"7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c\" (UID: \"7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c\") " Sep 30 17:20:17 crc kubenswrapper[4778]: I0930 17:20:17.105912 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c" (UID: "7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:20:17 crc kubenswrapper[4778]: I0930 17:20:17.119609 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c" (UID: "7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:17 crc kubenswrapper[4778]: I0930 17:20:17.207280 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:17 crc kubenswrapper[4778]: I0930 17:20:17.207329 4778 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:17 crc kubenswrapper[4778]: I0930 17:20:17.364007 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:20:17 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Sep 30 17:20:17 crc kubenswrapper[4778]: [+]process-running ok Sep 30 17:20:17 crc kubenswrapper[4778]: healthz check failed Sep 30 17:20:17 crc kubenswrapper[4778]: I0930 17:20:17.364098 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:20:17 crc kubenswrapper[4778]: I0930 17:20:17.699539 4778 generic.go:334] "Generic (PLEG): container finished" podID="7a3d4534-9878-41a4-b950-5e671d5e5cdb" containerID="c26c738d3ea980e510c6dbae3bc0c82a7b0769de27243a7b5451aaa656cb6750" exitCode=0 Sep 30 17:20:17 crc kubenswrapper[4778]: I0930 17:20:17.699921 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqhkf" event={"ID":"7a3d4534-9878-41a4-b950-5e671d5e5cdb","Type":"ContainerDied","Data":"c26c738d3ea980e510c6dbae3bc0c82a7b0769de27243a7b5451aaa656cb6750"} Sep 30 17:20:17 crc kubenswrapper[4778]: I0930 17:20:17.711296 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 17:20:17 crc kubenswrapper[4778]: I0930 17:20:17.711694 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c","Type":"ContainerDied","Data":"5246a3e014f554dad48236b613bfa230c6ef40b0d6eb0ebf921338e2627a2caa"} Sep 30 17:20:17 crc kubenswrapper[4778]: I0930 17:20:17.711762 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5246a3e014f554dad48236b613bfa230c6ef40b0d6eb0ebf921338e2627a2caa" Sep 30 17:20:18 crc kubenswrapper[4778]: I0930 17:20:18.116538 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:20:18 crc kubenswrapper[4778]: I0930 17:20:18.239746 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13141512-8786-4c16-8cf6-b836d2b3158f-kube-api-access\") pod \"13141512-8786-4c16-8cf6-b836d2b3158f\" (UID: \"13141512-8786-4c16-8cf6-b836d2b3158f\") " Sep 30 17:20:18 crc kubenswrapper[4778]: I0930 17:20:18.239814 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13141512-8786-4c16-8cf6-b836d2b3158f-kubelet-dir\") pod \"13141512-8786-4c16-8cf6-b836d2b3158f\" (UID: \"13141512-8786-4c16-8cf6-b836d2b3158f\") " Sep 30 17:20:18 crc kubenswrapper[4778]: I0930 17:20:18.240000 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13141512-8786-4c16-8cf6-b836d2b3158f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "13141512-8786-4c16-8cf6-b836d2b3158f" (UID: "13141512-8786-4c16-8cf6-b836d2b3158f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:20:18 crc kubenswrapper[4778]: I0930 17:20:18.240354 4778 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/13141512-8786-4c16-8cf6-b836d2b3158f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:18 crc kubenswrapper[4778]: I0930 17:20:18.255382 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13141512-8786-4c16-8cf6-b836d2b3158f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "13141512-8786-4c16-8cf6-b836d2b3158f" (UID: "13141512-8786-4c16-8cf6-b836d2b3158f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:20:18 crc kubenswrapper[4778]: I0930 17:20:18.342562 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/13141512-8786-4c16-8cf6-b836d2b3158f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 17:20:18 crc kubenswrapper[4778]: I0930 17:20:18.365169 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:20:18 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Sep 30 17:20:18 crc kubenswrapper[4778]: [+]process-running ok Sep 30 17:20:18 crc kubenswrapper[4778]: healthz check failed Sep 30 17:20:18 crc kubenswrapper[4778]: I0930 17:20:18.365229 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:20:18 crc kubenswrapper[4778]: I0930 17:20:18.754895 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"13141512-8786-4c16-8cf6-b836d2b3158f","Type":"ContainerDied","Data":"b7d4c5b7a0f262fe267c9363930ef19f226551746ddd3db38afe0ca9fb7894c5"} Sep 30 17:20:18 crc kubenswrapper[4778]: I0930 17:20:18.754963 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7d4c5b7a0f262fe267c9363930ef19f226551746ddd3db38afe0ca9fb7894c5" Sep 30 17:20:18 crc kubenswrapper[4778]: I0930 17:20:18.755075 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 17:20:19 crc kubenswrapper[4778]: I0930 17:20:19.362750 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:20:19 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Sep 30 17:20:19 crc kubenswrapper[4778]: [+]process-running ok Sep 30 17:20:19 crc kubenswrapper[4778]: healthz check failed Sep 30 17:20:19 crc kubenswrapper[4778]: I0930 17:20:19.363525 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:20:19 crc kubenswrapper[4778]: I0930 17:20:19.451406 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-558p9" Sep 30 17:20:20 crc kubenswrapper[4778]: I0930 17:20:20.361526 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:20:20 crc kubenswrapper[4778]: [-]has-synced failed: reason withheld Sep 30 17:20:20 crc kubenswrapper[4778]: [+]process-running ok Sep 30 17:20:20 crc kubenswrapper[4778]: healthz check failed Sep 30 17:20:20 crc kubenswrapper[4778]: I0930 17:20:20.361595 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:20:21 crc kubenswrapper[4778]: I0930 17:20:21.362089 4778 patch_prober.go:28] interesting pod/router-default-5444994796-svds2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 17:20:21 crc kubenswrapper[4778]: [+]has-synced ok Sep 30 17:20:21 crc kubenswrapper[4778]: [+]process-running ok Sep 30 17:20:21 crc kubenswrapper[4778]: healthz check failed Sep 30 17:20:21 crc kubenswrapper[4778]: I0930 17:20:21.362848 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-svds2" podUID="4d85180d-59c0-4f8f-8481-170f27db08b6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 17:20:22 crc kubenswrapper[4778]: I0930 17:20:22.361850 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:22 crc kubenswrapper[4778]: I0930 17:20:22.365082 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-svds2" Sep 30 17:20:24 crc kubenswrapper[4778]: I0930 17:20:24.162925 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-xr85z container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Sep 30 17:20:24 crc kubenswrapper[4778]: I0930 17:20:24.163840 4778 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xr85z" podUID="22eb8492-05be-4e05-a4b8-34f965c014ed" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Sep 30 17:20:24 crc kubenswrapper[4778]: I0930 17:20:24.163156 4778 patch_prober.go:28] interesting pod/downloads-7954f5f757-xr85z container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Sep 30 17:20:24 crc kubenswrapper[4778]: I0930 17:20:24.164274 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-xr85z" podUID="22eb8492-05be-4e05-a4b8-34f965c014ed" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Sep 30 17:20:24 crc kubenswrapper[4778]: I0930 17:20:24.768361 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs\") pod \"network-metrics-daemon-l88vm\" (UID: \"8b0c73d9-9a75-4e65-9220-904133af63fd\") " pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:20:24 crc kubenswrapper[4778]: I0930 17:20:24.794691 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b0c73d9-9a75-4e65-9220-904133af63fd-metrics-certs\") pod \"network-metrics-daemon-l88vm\" (UID: \"8b0c73d9-9a75-4e65-9220-904133af63fd\") " pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:20:24 crc kubenswrapper[4778]: I0930 17:20:24.919947 4778 patch_prober.go:28] interesting pod/console-f9d7485db-pl4fp container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Sep 30 17:20:24 crc kubenswrapper[4778]: I0930 17:20:24.920045 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-pl4fp" podUID="82aaff94-0c3c-4a1b-be0c-5371c3b60ab0" containerName="console" probeResult="failure" output="Get \"https://10.217.0.7:8443/health\": dial tcp 10.217.0.7:8443: connect: connection refused" Sep 30 17:20:24 crc kubenswrapper[4778]: I0930 17:20:24.962261 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l88vm" Sep 30 17:20:32 crc kubenswrapper[4778]: I0930 17:20:32.378356 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:20:34 crc kubenswrapper[4778]: I0930 17:20:34.180541 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-xr85z" Sep 30 17:20:34 crc kubenswrapper[4778]: I0930 17:20:34.924479 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:34 crc kubenswrapper[4778]: I0930 17:20:34.930821 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:20:44 crc kubenswrapper[4778]: I0930 17:20:44.349788 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-klvfk" Sep 30 17:20:44 crc kubenswrapper[4778]: I0930 17:20:44.811943 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:20:44 crc kubenswrapper[4778]: I0930 17:20:44.812016 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:20:46 crc kubenswrapper[4778]: E0930 17:20:46.810100 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Sep 30 17:20:46 crc kubenswrapper[4778]: E0930 17:20:46.810859 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k28cx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-lbvnt_openshift-marketplace(2654d449-8673-41a9-b2a9-c5f986819740): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:20:46 crc kubenswrapper[4778]: E0930 17:20:46.812109 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-lbvnt" podUID="2654d449-8673-41a9-b2a9-c5f986819740" Sep 30 17:20:49 crc kubenswrapper[4778]: E0930 17:20:49.735098 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Sep 30 17:20:49 crc kubenswrapper[4778]: E0930 17:20:49.735738 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9r6p5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-hpfsj_openshift-marketplace(bb93f0e3-e3b6-41bc-af18-91e3107fc79a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:20:49 crc kubenswrapper[4778]: E0930 17:20:49.736944 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-hpfsj" podUID="bb93f0e3-e3b6-41bc-af18-91e3107fc79a" Sep 30 17:20:49 crc kubenswrapper[4778]: I0930 17:20:49.983632 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 17:20:50 crc kubenswrapper[4778]: E0930 17:20:50.423486 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-hpfsj" podUID="bb93f0e3-e3b6-41bc-af18-91e3107fc79a" Sep 30 17:20:50 crc kubenswrapper[4778]: E0930 17:20:50.494697 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 30 17:20:50 crc kubenswrapper[4778]: E0930 17:20:50.494892 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8dz4c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-ls2gx_openshift-marketplace(dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:20:50 crc kubenswrapper[4778]: E0930 17:20:50.496136 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-ls2gx" podUID="dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122" Sep 30 17:20:51 crc kubenswrapper[4778]: E0930 17:20:51.750834 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-ls2gx" podUID="dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122" Sep 30 17:20:51 crc kubenswrapper[4778]: E0930 17:20:51.834434 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 30 17:20:51 crc kubenswrapper[4778]: E0930 17:20:51.834630 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7w9zg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-rhbnw_openshift-marketplace(e90450b5-cbd1-44fd-9fd6-7def6ea75b33): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:20:51 crc kubenswrapper[4778]: E0930 17:20:51.835928 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-rhbnw" podUID="e90450b5-cbd1-44fd-9fd6-7def6ea75b33" Sep 30 17:20:51 crc kubenswrapper[4778]: E0930 17:20:51.996511 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-rhbnw" podUID="e90450b5-cbd1-44fd-9fd6-7def6ea75b33" Sep 30 17:20:52 crc kubenswrapper[4778]: E0930 17:20:52.121718 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Sep 30 17:20:52 crc kubenswrapper[4778]: E0930 17:20:52.121934 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p7wj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-bcbgk_openshift-marketplace(e159da09-2a9f-4472-acca-abe0193feb9f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:20:52 crc kubenswrapper[4778]: E0930 17:20:52.123179 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-bcbgk" podUID="e159da09-2a9f-4472-acca-abe0193feb9f" Sep 30 17:20:52 crc kubenswrapper[4778]: E0930 17:20:52.177766 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Sep 30 17:20:52 crc kubenswrapper[4778]: E0930 17:20:52.177976 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5dgmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-wfx99_openshift-marketplace(7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:20:52 crc kubenswrapper[4778]: E0930 17:20:52.180070 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-wfx99" podUID="7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b" Sep 30 17:20:52 crc kubenswrapper[4778]: E0930 17:20:52.185664 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Sep 30 17:20:52 crc kubenswrapper[4778]: E0930 17:20:52.185844 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-78dtx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-42qm2_openshift-marketplace(7d747b8a-a90d-4f5b-8431-cbe604c9ee7e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Sep 30 17:20:52 crc kubenswrapper[4778]: E0930 17:20:52.188097 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-42qm2" podUID="7d747b8a-a90d-4f5b-8431-cbe604c9ee7e" Sep 30 17:20:52 crc kubenswrapper[4778]: I0930 17:20:52.188390 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l88vm"] Sep 30 17:20:52 crc kubenswrapper[4778]: W0930 17:20:52.199283 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b0c73d9_9a75_4e65_9220_904133af63fd.slice/crio-418c864b8a3ad55ce3da6fc011d303e1f9cf42d65b8db81d500bd854d70dc4f8 WatchSource:0}: Error finding container 418c864b8a3ad55ce3da6fc011d303e1f9cf42d65b8db81d500bd854d70dc4f8: Status 404 returned error can't find the container with id 418c864b8a3ad55ce3da6fc011d303e1f9cf42d65b8db81d500bd854d70dc4f8 Sep 30 17:20:53 crc kubenswrapper[4778]: I0930 17:20:53.000928 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l88vm" event={"ID":"8b0c73d9-9a75-4e65-9220-904133af63fd","Type":"ContainerStarted","Data":"fdee6041952c7367a00b9023b6ed7af02c4dbe269b3847e97d5c30f7f0d98f0e"} Sep 30 17:20:53 crc kubenswrapper[4778]: I0930 17:20:53.001937 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l88vm" event={"ID":"8b0c73d9-9a75-4e65-9220-904133af63fd","Type":"ContainerStarted","Data":"418c864b8a3ad55ce3da6fc011d303e1f9cf42d65b8db81d500bd854d70dc4f8"} Sep 30 17:20:53 crc kubenswrapper[4778]: E0930 17:20:53.002807 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-wfx99" podUID="7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b" Sep 30 17:20:53 crc kubenswrapper[4778]: E0930 17:20:53.002977 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-bcbgk" podUID="e159da09-2a9f-4472-acca-abe0193feb9f" Sep 30 17:20:53 crc kubenswrapper[4778]: E0930 17:20:53.003659 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-42qm2" podUID="7d747b8a-a90d-4f5b-8431-cbe604c9ee7e" Sep 30 17:20:54 crc kubenswrapper[4778]: I0930 17:20:54.009882 4778 generic.go:334] "Generic (PLEG): container finished" podID="7a3d4534-9878-41a4-b950-5e671d5e5cdb" containerID="541fb6c72d78a19931b2f90432c8701521f0acacf7ea88406a44916361e0b5ab" exitCode=0 Sep 30 17:20:54 crc kubenswrapper[4778]: I0930 17:20:54.009976 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqhkf" event={"ID":"7a3d4534-9878-41a4-b950-5e671d5e5cdb","Type":"ContainerDied","Data":"541fb6c72d78a19931b2f90432c8701521f0acacf7ea88406a44916361e0b5ab"} Sep 30 17:20:54 crc kubenswrapper[4778]: I0930 17:20:54.013468 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l88vm" event={"ID":"8b0c73d9-9a75-4e65-9220-904133af63fd","Type":"ContainerStarted","Data":"a7848161280ce883ab82619f5d5a47f0f2142a165059683ecbfc6ef4c3a74804"} Sep 30 17:20:55 crc kubenswrapper[4778]: I0930 17:20:55.027593 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqhkf" event={"ID":"7a3d4534-9878-41a4-b950-5e671d5e5cdb","Type":"ContainerStarted","Data":"55e3afba1b93673cf19cdb73a8628139b6b3668878da8014654625947645d83a"} Sep 30 17:20:55 crc kubenswrapper[4778]: I0930 17:20:55.050903 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-l88vm" podStartSLOduration=173.050877973 podStartE2EDuration="2m53.050877973s" podCreationTimestamp="2025-09-30 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:20:54.050496938 +0000 UTC m=+193.040394921" watchObservedRunningTime="2025-09-30 17:20:55.050877973 +0000 UTC m=+194.040775776" Sep 30 17:20:55 crc kubenswrapper[4778]: I0930 17:20:55.051609 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vqhkf" podStartSLOduration=2.951613664 podStartE2EDuration="40.051604407s" podCreationTimestamp="2025-09-30 17:20:15 +0000 UTC" firstStartedPulling="2025-09-30 17:20:17.705065886 +0000 UTC m=+156.694963689" lastFinishedPulling="2025-09-30 17:20:54.805056629 +0000 UTC m=+193.794954432" observedRunningTime="2025-09-30 17:20:55.048012535 +0000 UTC m=+194.037910348" watchObservedRunningTime="2025-09-30 17:20:55.051604407 +0000 UTC m=+194.041502210" Sep 30 17:20:55 crc kubenswrapper[4778]: I0930 17:20:55.979318 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-vqhkf" Sep 30 17:20:55 crc kubenswrapper[4778]: I0930 17:20:55.980133 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vqhkf" Sep 30 17:20:57 crc kubenswrapper[4778]: I0930 17:20:57.415574 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vqhkf" podUID="7a3d4534-9878-41a4-b950-5e671d5e5cdb" containerName="registry-server" probeResult="failure" output=< Sep 30 17:20:57 crc kubenswrapper[4778]: timeout: failed to connect service ":50051" within 1s Sep 30 17:20:57 crc kubenswrapper[4778]: > Sep 30 17:21:04 crc kubenswrapper[4778]: I0930 17:21:04.079749 4778 generic.go:334] "Generic (PLEG): container finished" podID="2654d449-8673-41a9-b2a9-c5f986819740" containerID="7105106f5ae633bb5f9b5efc94f20df600287688631a66defbb2848af8cdb6d5" exitCode=0 Sep 30 17:21:04 crc kubenswrapper[4778]: I0930 17:21:04.079853 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lbvnt" event={"ID":"2654d449-8673-41a9-b2a9-c5f986819740","Type":"ContainerDied","Data":"7105106f5ae633bb5f9b5efc94f20df600287688631a66defbb2848af8cdb6d5"} Sep 30 17:21:05 crc kubenswrapper[4778]: I0930 17:21:05.090163 4778 generic.go:334] "Generic (PLEG): container finished" podID="dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122" containerID="d72ce827f3380cd1e67de7e0a3338baa92cceee75bf88f480429e96a397b6507" exitCode=0 Sep 30 17:21:05 crc kubenswrapper[4778]: I0930 17:21:05.090302 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls2gx" event={"ID":"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122","Type":"ContainerDied","Data":"d72ce827f3380cd1e67de7e0a3338baa92cceee75bf88f480429e96a397b6507"} Sep 30 17:21:06 crc kubenswrapper[4778]: I0930 17:21:06.099557 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpfsj" event={"ID":"bb93f0e3-e3b6-41bc-af18-91e3107fc79a","Type":"ContainerStarted","Data":"b535aecc0f121e30e87d6f0c91396c35344a7ad6b134782e4c381482346004e3"} Sep 30 17:21:06 crc kubenswrapper[4778]: I0930 17:21:06.103552 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lbvnt" event={"ID":"2654d449-8673-41a9-b2a9-c5f986819740","Type":"ContainerStarted","Data":"e1dd6312191b225494d62b08f29827cbc2f45024e9a686f6563d5e6505f8ee9f"} Sep 30 17:21:06 crc kubenswrapper[4778]: I0930 17:21:06.144804 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lbvnt" podStartSLOduration=4.404769771 podStartE2EDuration="54.144784176s" podCreationTimestamp="2025-09-30 17:20:12 +0000 UTC" firstStartedPulling="2025-09-30 17:20:15.618079323 +0000 UTC m=+154.607977126" lastFinishedPulling="2025-09-30 17:21:05.358093728 +0000 UTC m=+204.347991531" observedRunningTime="2025-09-30 17:21:06.143889915 +0000 UTC m=+205.133787718" watchObservedRunningTime="2025-09-30 17:21:06.144784176 +0000 UTC m=+205.134681979" Sep 30 17:21:06 crc kubenswrapper[4778]: I0930 17:21:06.189065 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vqhkf" Sep 30 17:21:06 crc kubenswrapper[4778]: I0930 17:21:06.238567 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vqhkf" Sep 30 17:21:07 crc kubenswrapper[4778]: I0930 17:21:07.548336 4778 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vqhkf"] Sep 30 17:21:08 crc kubenswrapper[4778]: I0930 17:21:08.114011 4778 generic.go:334] "Generic (PLEG): container finished" podID="bb93f0e3-e3b6-41bc-af18-91e3107fc79a" containerID="b535aecc0f121e30e87d6f0c91396c35344a7ad6b134782e4c381482346004e3" exitCode=0 Sep 30 17:21:08 crc kubenswrapper[4778]: I0930 17:21:08.114092 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpfsj" event={"ID":"bb93f0e3-e3b6-41bc-af18-91e3107fc79a","Type":"ContainerDied","Data":"b535aecc0f121e30e87d6f0c91396c35344a7ad6b134782e4c381482346004e3"} Sep 30 17:21:08 crc kubenswrapper[4778]: I0930 17:21:08.114253 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vqhkf" podUID="7a3d4534-9878-41a4-b950-5e671d5e5cdb" containerName="registry-server" containerID="cri-o://55e3afba1b93673cf19cdb73a8628139b6b3668878da8014654625947645d83a" gracePeriod=2 Sep 30 17:21:12 crc kubenswrapper[4778]: I0930 17:21:12.140595 4778 generic.go:334] "Generic (PLEG): container finished" podID="7a3d4534-9878-41a4-b950-5e671d5e5cdb" containerID="55e3afba1b93673cf19cdb73a8628139b6b3668878da8014654625947645d83a" exitCode=0 Sep 30 17:21:12 crc kubenswrapper[4778]: I0930 17:21:12.140657 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqhkf" event={"ID":"7a3d4534-9878-41a4-b950-5e671d5e5cdb","Type":"ContainerDied","Data":"55e3afba1b93673cf19cdb73a8628139b6b3668878da8014654625947645d83a"} Sep 30 17:21:12 crc kubenswrapper[4778]: I0930 17:21:12.923937 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lbvnt" Sep 30 17:21:12 crc kubenswrapper[4778]: I0930 17:21:12.924390 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lbvnt" Sep 30 17:21:12 crc kubenswrapper[4778]: I0930 17:21:12.978769 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lbvnt" Sep 30 17:21:13 crc kubenswrapper[4778]: I0930 17:21:13.189202 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lbvnt" Sep 30 17:21:14 crc kubenswrapper[4778]: I0930 17:21:14.353561 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lbvnt"] Sep 30 17:21:14 crc kubenswrapper[4778]: I0930 17:21:14.812243 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:21:14 crc kubenswrapper[4778]: I0930 17:21:14.812352 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:21:14 crc kubenswrapper[4778]: I0930 17:21:14.812465 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:21:14 crc kubenswrapper[4778]: I0930 17:21:14.813505 4778 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505"} pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:21:14 crc kubenswrapper[4778]: I0930 17:21:14.813694 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" containerID="cri-o://abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505" gracePeriod=600 Sep 30 17:21:15 crc kubenswrapper[4778]: E0930 17:21:15.981325 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 55e3afba1b93673cf19cdb73a8628139b6b3668878da8014654625947645d83a is running failed: container process not found" containerID="55e3afba1b93673cf19cdb73a8628139b6b3668878da8014654625947645d83a" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:21:15 crc kubenswrapper[4778]: E0930 17:21:15.983802 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 55e3afba1b93673cf19cdb73a8628139b6b3668878da8014654625947645d83a is running failed: container process not found" containerID="55e3afba1b93673cf19cdb73a8628139b6b3668878da8014654625947645d83a" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:21:15 crc kubenswrapper[4778]: E0930 17:21:15.984647 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 55e3afba1b93673cf19cdb73a8628139b6b3668878da8014654625947645d83a is running failed: container process not found" containerID="55e3afba1b93673cf19cdb73a8628139b6b3668878da8014654625947645d83a" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:21:15 crc kubenswrapper[4778]: E0930 17:21:15.984723 4778 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 55e3afba1b93673cf19cdb73a8628139b6b3668878da8014654625947645d83a is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-vqhkf" podUID="7a3d4534-9878-41a4-b950-5e671d5e5cdb" containerName="registry-server" Sep 30 17:21:16 crc kubenswrapper[4778]: I0930 17:21:16.168005 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lbvnt" podUID="2654d449-8673-41a9-b2a9-c5f986819740" containerName="registry-server" containerID="cri-o://e1dd6312191b225494d62b08f29827cbc2f45024e9a686f6563d5e6505f8ee9f" gracePeriod=2 Sep 30 17:21:17 crc kubenswrapper[4778]: I0930 17:21:17.174563 4778 generic.go:334] "Generic (PLEG): container finished" podID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerID="abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505" exitCode=0 Sep 30 17:21:17 crc kubenswrapper[4778]: I0930 17:21:17.174657 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerDied","Data":"abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505"} Sep 30 17:21:19 crc 
kubenswrapper[4778]: I0930 17:21:19.190097 4778 generic.go:334] "Generic (PLEG): container finished" podID="2654d449-8673-41a9-b2a9-c5f986819740" containerID="e1dd6312191b225494d62b08f29827cbc2f45024e9a686f6563d5e6505f8ee9f" exitCode=0 Sep 30 17:21:19 crc kubenswrapper[4778]: I0930 17:21:19.190176 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lbvnt" event={"ID":"2654d449-8673-41a9-b2a9-c5f986819740","Type":"ContainerDied","Data":"e1dd6312191b225494d62b08f29827cbc2f45024e9a686f6563d5e6505f8ee9f"} Sep 30 17:21:20 crc kubenswrapper[4778]: I0930 17:21:20.870476 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lbvnt" Sep 30 17:21:20 crc kubenswrapper[4778]: I0930 17:21:20.914063 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2654d449-8673-41a9-b2a9-c5f986819740-utilities\") pod \"2654d449-8673-41a9-b2a9-c5f986819740\" (UID: \"2654d449-8673-41a9-b2a9-c5f986819740\") " Sep 30 17:21:20 crc kubenswrapper[4778]: I0930 17:21:20.914172 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k28cx\" (UniqueName: \"kubernetes.io/projected/2654d449-8673-41a9-b2a9-c5f986819740-kube-api-access-k28cx\") pod \"2654d449-8673-41a9-b2a9-c5f986819740\" (UID: \"2654d449-8673-41a9-b2a9-c5f986819740\") " Sep 30 17:21:20 crc kubenswrapper[4778]: I0930 17:21:20.914239 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2654d449-8673-41a9-b2a9-c5f986819740-catalog-content\") pod \"2654d449-8673-41a9-b2a9-c5f986819740\" (UID: \"2654d449-8673-41a9-b2a9-c5f986819740\") " Sep 30 17:21:20 crc kubenswrapper[4778]: I0930 17:21:20.918390 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2654d449-8673-41a9-b2a9-c5f986819740-utilities" (OuterVolumeSpecName: "utilities") pod "2654d449-8673-41a9-b2a9-c5f986819740" (UID: "2654d449-8673-41a9-b2a9-c5f986819740"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:21:20 crc kubenswrapper[4778]: I0930 17:21:20.926863 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2654d449-8673-41a9-b2a9-c5f986819740-kube-api-access-k28cx" (OuterVolumeSpecName: "kube-api-access-k28cx") pod "2654d449-8673-41a9-b2a9-c5f986819740" (UID: "2654d449-8673-41a9-b2a9-c5f986819740"). InnerVolumeSpecName "kube-api-access-k28cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.016626 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k28cx\" (UniqueName: \"kubernetes.io/projected/2654d449-8673-41a9-b2a9-c5f986819740-kube-api-access-k28cx\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.016687 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2654d449-8673-41a9-b2a9-c5f986819740-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.137893 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vqhkf" Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.204972 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lbvnt" event={"ID":"2654d449-8673-41a9-b2a9-c5f986819740","Type":"ContainerDied","Data":"81f89580731a50a28f41f8c3a9c69b99252f22d109f20cab6bc4db628a369050"} Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.205076 4778 scope.go:117] "RemoveContainer" containerID="e1dd6312191b225494d62b08f29827cbc2f45024e9a686f6563d5e6505f8ee9f" Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.204999 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lbvnt" Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.207758 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vqhkf" event={"ID":"7a3d4534-9878-41a4-b950-5e671d5e5cdb","Type":"ContainerDied","Data":"347ad9bcad8908a6e77e318b1e8ce381687fb31a3b47a5854fa5212946c81923"} Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.207863 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vqhkf" Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.218501 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g54wt\" (UniqueName: \"kubernetes.io/projected/7a3d4534-9878-41a4-b950-5e671d5e5cdb-kube-api-access-g54wt\") pod \"7a3d4534-9878-41a4-b950-5e671d5e5cdb\" (UID: \"7a3d4534-9878-41a4-b950-5e671d5e5cdb\") " Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.218676 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3d4534-9878-41a4-b950-5e671d5e5cdb-utilities\") pod \"7a3d4534-9878-41a4-b950-5e671d5e5cdb\" (UID: \"7a3d4534-9878-41a4-b950-5e671d5e5cdb\") " Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.218773 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3d4534-9878-41a4-b950-5e671d5e5cdb-catalog-content\") pod \"7a3d4534-9878-41a4-b950-5e671d5e5cdb\" (UID: \"7a3d4534-9878-41a4-b950-5e671d5e5cdb\") " Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.219713 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a3d4534-9878-41a4-b950-5e671d5e5cdb-utilities" (OuterVolumeSpecName: "utilities") pod "7a3d4534-9878-41a4-b950-5e671d5e5cdb" (UID: "7a3d4534-9878-41a4-b950-5e671d5e5cdb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.229292 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a3d4534-9878-41a4-b950-5e671d5e5cdb-kube-api-access-g54wt" (OuterVolumeSpecName: "kube-api-access-g54wt") pod "7a3d4534-9878-41a4-b950-5e671d5e5cdb" (UID: "7a3d4534-9878-41a4-b950-5e671d5e5cdb"). InnerVolumeSpecName "kube-api-access-g54wt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.316548 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a3d4534-9878-41a4-b950-5e671d5e5cdb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a3d4534-9878-41a4-b950-5e671d5e5cdb" (UID: "7a3d4534-9878-41a4-b950-5e671d5e5cdb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.320537 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a3d4534-9878-41a4-b950-5e671d5e5cdb-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.320573 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g54wt\" (UniqueName: \"kubernetes.io/projected/7a3d4534-9878-41a4-b950-5e671d5e5cdb-kube-api-access-g54wt\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.320590 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a3d4534-9878-41a4-b950-5e671d5e5cdb-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.547531 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vqhkf"] Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.554276 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vqhkf"] Sep 30 17:21:21 crc kubenswrapper[4778]: I0930 17:21:21.732028 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a3d4534-9878-41a4-b950-5e671d5e5cdb" path="/var/lib/kubelet/pods/7a3d4534-9878-41a4-b950-5e671d5e5cdb/volumes" Sep 30 17:21:22 crc kubenswrapper[4778]: I0930 17:21:22.550868 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2654d449-8673-41a9-b2a9-c5f986819740-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2654d449-8673-41a9-b2a9-c5f986819740" (UID: "2654d449-8673-41a9-b2a9-c5f986819740"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:21:22 crc kubenswrapper[4778]: I0930 17:21:22.637772 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2654d449-8673-41a9-b2a9-c5f986819740-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:22 crc kubenswrapper[4778]: I0930 17:21:22.737098 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lbvnt"] Sep 30 17:21:22 crc kubenswrapper[4778]: I0930 17:21:22.740809 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lbvnt"] Sep 30 17:21:23 crc kubenswrapper[4778]: I0930 17:21:23.683552 4778 scope.go:117] "RemoveContainer" containerID="7105106f5ae633bb5f9b5efc94f20df600287688631a66defbb2848af8cdb6d5" Sep 30 17:21:23 crc kubenswrapper[4778]: I0930 17:21:23.721357 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2654d449-8673-41a9-b2a9-c5f986819740" path="/var/lib/kubelet/pods/2654d449-8673-41a9-b2a9-c5f986819740/volumes" Sep 30 17:21:25 crc kubenswrapper[4778]: I0930 17:21:25.619740 4778 scope.go:117] "RemoveContainer" containerID="8a7c82884b28f147a8c5d3048ff2112d1d84094812b70176ff306f164b2752ae" Sep 30 17:21:26 crc kubenswrapper[4778]: I0930 17:21:26.336849 4778 scope.go:117] "RemoveContainer" containerID="55e3afba1b93673cf19cdb73a8628139b6b3668878da8014654625947645d83a" Sep 30 17:21:26 crc kubenswrapper[4778]: I0930 17:21:26.517125 4778 scope.go:117] "RemoveContainer" containerID="541fb6c72d78a19931b2f90432c8701521f0acacf7ea88406a44916361e0b5ab" Sep 30 17:21:27 crc kubenswrapper[4778]: I0930 17:21:27.627511 4778 scope.go:117] "RemoveContainer" containerID="c26c738d3ea980e510c6dbae3bc0c82a7b0769de27243a7b5451aaa656cb6750" Sep 30 17:21:29 crc kubenswrapper[4778]: I0930 17:21:29.285666 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerStarted","Data":"13a00401a21c43e81f99eb39c3b8e765d2e6610548614249a56f21e67091808b"} Sep 30 17:21:29 crc kubenswrapper[4778]: I0930 17:21:29.290167 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls2gx" event={"ID":"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122","Type":"ContainerStarted","Data":"8f6945349f62c2ff203027a41a14381f3f1b3a2601efe66ef14ba797568acc55"} Sep 30 17:21:29 crc kubenswrapper[4778]: I0930 17:21:29.293488 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpfsj" event={"ID":"bb93f0e3-e3b6-41bc-af18-91e3107fc79a","Type":"ContainerStarted","Data":"a3ae404c6576ae7345ea0e24a55339b72feb6064cdc7e08fdd8f1455c92ffb81"} Sep 30 17:21:29 crc kubenswrapper[4778]: I0930 17:21:29.296205 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcbgk" event={"ID":"e159da09-2a9f-4472-acca-abe0193feb9f","Type":"ContainerStarted","Data":"332fb01497fcfb524aa487251d48fb3da9cf28d1ebac5e41a0836377ddbfd371"} Sep 30 17:21:30 crc kubenswrapper[4778]: I0930 17:21:30.302206 4778 generic.go:334] "Generic (PLEG): container finished" podID="e159da09-2a9f-4472-acca-abe0193feb9f" containerID="332fb01497fcfb524aa487251d48fb3da9cf28d1ebac5e41a0836377ddbfd371" exitCode=0 Sep 30 17:21:30 crc kubenswrapper[4778]: I0930 17:21:30.302397 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcbgk" 
event={"ID":"e159da09-2a9f-4472-acca-abe0193feb9f","Type":"ContainerDied","Data":"332fb01497fcfb524aa487251d48fb3da9cf28d1ebac5e41a0836377ddbfd371"} Sep 30 17:21:30 crc kubenswrapper[4778]: I0930 17:21:30.305502 4778 generic.go:334] "Generic (PLEG): container finished" podID="e90450b5-cbd1-44fd-9fd6-7def6ea75b33" containerID="b9b5ac38c2ff94b2a26931805d6eae80f2e59d0325e1078c91ab591e1218e50c" exitCode=0 Sep 30 17:21:30 crc kubenswrapper[4778]: I0930 17:21:30.305538 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhbnw" event={"ID":"e90450b5-cbd1-44fd-9fd6-7def6ea75b33","Type":"ContainerDied","Data":"b9b5ac38c2ff94b2a26931805d6eae80f2e59d0325e1078c91ab591e1218e50c"} Sep 30 17:21:30 crc kubenswrapper[4778]: I0930 17:21:30.307605 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wfx99" event={"ID":"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b","Type":"ContainerStarted","Data":"3a172f7e2f1298d5a65a4b64254d45cec4d9e34b74d3174b2aa95f285b7874dc"} Sep 30 17:21:30 crc kubenswrapper[4778]: I0930 17:21:30.411495 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hpfsj" podStartSLOduration=4.420400642 podStartE2EDuration="1m15.411477193s" podCreationTimestamp="2025-09-30 17:20:15 +0000 UTC" firstStartedPulling="2025-09-30 17:20:16.629677527 +0000 UTC m=+155.619575330" lastFinishedPulling="2025-09-30 17:21:27.620754078 +0000 UTC m=+226.610651881" observedRunningTime="2025-09-30 17:21:30.410491769 +0000 UTC m=+229.400389572" watchObservedRunningTime="2025-09-30 17:21:30.411477193 +0000 UTC m=+229.401374986" Sep 30 17:21:30 crc kubenswrapper[4778]: I0930 17:21:30.429801 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ls2gx" podStartSLOduration=5.728593223 podStartE2EDuration="1m16.429784063s" podCreationTimestamp="2025-09-30 17:20:14 +0000 UTC" firstStartedPulling="2025-09-30 17:20:15.63592029 +0000 UTC m=+154.625818093" lastFinishedPulling="2025-09-30 17:21:26.33711113 +0000 UTC m=+225.327008933" observedRunningTime="2025-09-30 17:21:30.427099913 +0000 UTC m=+229.416997726" watchObservedRunningTime="2025-09-30 17:21:30.429784063 +0000 UTC m=+229.419681866" Sep 30 17:21:31 crc kubenswrapper[4778]: I0930 17:21:31.313786 4778 generic.go:334] "Generic (PLEG): container finished" podID="7d747b8a-a90d-4f5b-8431-cbe604c9ee7e" containerID="4dc0d87f9fe13d371785e3838c807cd3b610e5d439c77454f2e0b2b2b72147e0" exitCode=0 Sep 30 17:21:31 crc kubenswrapper[4778]: I0930 17:21:31.313989 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42qm2" event={"ID":"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e","Type":"ContainerDied","Data":"4dc0d87f9fe13d371785e3838c807cd3b610e5d439c77454f2e0b2b2b72147e0"} Sep 30 17:21:31 crc kubenswrapper[4778]: I0930 17:21:31.316602 4778 generic.go:334] "Generic (PLEG): container finished" podID="7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b" containerID="3a172f7e2f1298d5a65a4b64254d45cec4d9e34b74d3174b2aa95f285b7874dc" exitCode=0 Sep 30 17:21:31 crc kubenswrapper[4778]: I0930 17:21:31.316655 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wfx99" event={"ID":"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b","Type":"ContainerDied","Data":"3a172f7e2f1298d5a65a4b64254d45cec4d9e34b74d3174b2aa95f285b7874dc"} Sep 30 17:21:33 crc kubenswrapper[4778]: I0930 17:21:33.335813 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhbnw" event={"ID":"e90450b5-cbd1-44fd-9fd6-7def6ea75b33","Type":"ContainerStarted","Data":"0afcec66214afae79c73efd76952ca1ef73091e44c0842b6640d8725f8e994c5"} Sep 30 17:21:33 crc kubenswrapper[4778]: I0930 17:21:33.339659 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42qm2" event={"ID":"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e","Type":"ContainerStarted","Data":"270e9992fb03a647be9ab0ebb08bcb1dd231976958b35fc2ab321399017ecc6e"} Sep 30 17:21:33 crc kubenswrapper[4778]: I0930 17:21:33.359936 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rhbnw" podStartSLOduration=3.499033502 podStartE2EDuration="1m21.359913133s" podCreationTimestamp="2025-09-30 17:20:12 +0000 UTC" firstStartedPulling="2025-09-30 17:20:14.458871906 +0000 UTC m=+153.448769709" lastFinishedPulling="2025-09-30 17:21:32.319751537 +0000 UTC m=+231.309649340" observedRunningTime="2025-09-30 17:21:33.357003545 +0000 UTC m=+232.346901348" watchObservedRunningTime="2025-09-30 17:21:33.359913133 +0000 UTC m=+232.349810936" Sep 30 17:21:34 crc kubenswrapper[4778]: I0930 17:21:34.349997 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcbgk" event={"ID":"e159da09-2a9f-4472-acca-abe0193feb9f","Type":"ContainerStarted","Data":"1a3aad032acc08231722b83fab0cc1a037db104a1d6f2cde1f7fbb262c488587"} Sep 30 17:21:34 crc kubenswrapper[4778]: I0930 17:21:34.352637 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wfx99" event={"ID":"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b","Type":"ContainerStarted","Data":"075f0a930271f6f1725ada870056b64b15b2abddfb8ef10a93e33e0e913478c1"} Sep 30 17:21:34 crc kubenswrapper[4778]: I0930 17:21:34.377005 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bcbgk" podStartSLOduration=3.767779226 podStartE2EDuration="1m23.376987188s" podCreationTimestamp="2025-09-30 17:20:11 +0000 UTC" firstStartedPulling="2025-09-30 17:20:14.383830386 +0000 UTC m=+153.373728189" lastFinishedPulling="2025-09-30 17:21:33.993038348 +0000 UTC m=+232.982936151" observedRunningTime="2025-09-30 17:21:34.372979772 +0000 UTC m=+233.362877585" watchObservedRunningTime="2025-09-30 17:21:34.376987188 +0000 UTC m=+233.366884991" Sep 30 17:21:34 crc kubenswrapper[4778]: I0930 17:21:34.377380 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-42qm2" podStartSLOduration=4.299232287 podStartE2EDuration="1m20.377375111s" podCreationTimestamp="2025-09-30 17:20:14 +0000 UTC" firstStartedPulling="2025-09-30 17:20:16.645956374 +0000 UTC m=+155.635854177" lastFinishedPulling="2025-09-30 17:21:32.724099198 +0000 UTC m=+231.713997001" observedRunningTime="2025-09-30 17:21:33.378590846 +0000 UTC m=+232.368488649" watchObservedRunningTime="2025-09-30 17:21:34.377375111 +0000 UTC m=+233.367272924" Sep 30 17:21:34 crc kubenswrapper[4778]: I0930 17:21:34.393288 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wfx99" podStartSLOduration=3.712571816 podStartE2EDuration="1m22.39326918s" podCreationTimestamp="2025-09-30 17:20:12 +0000 UTC" firstStartedPulling="2025-09-30 17:20:14.486306652 +0000 UTC m=+153.476204455" lastFinishedPulling="2025-09-30 17:21:33.167004016 
+0000 UTC m=+232.156901819" observedRunningTime="2025-09-30 17:21:34.39301079 +0000 UTC m=+233.382908613" watchObservedRunningTime="2025-09-30 17:21:34.39326918 +0000 UTC m=+233.383166983" Sep 30 17:21:34 crc kubenswrapper[4778]: I0930 17:21:34.521240 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ls2gx" Sep 30 17:21:34 crc kubenswrapper[4778]: I0930 17:21:34.521323 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ls2gx" Sep 30 17:21:34 crc kubenswrapper[4778]: I0930 17:21:34.561748 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ls2gx" Sep 30 17:21:34 crc kubenswrapper[4778]: I0930 17:21:34.954261 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-42qm2" Sep 30 17:21:34 crc kubenswrapper[4778]: I0930 17:21:34.954750 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-42qm2" Sep 30 17:21:35 crc kubenswrapper[4778]: I0930 17:21:35.009230 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-42qm2" Sep 30 17:21:35 crc kubenswrapper[4778]: I0930 17:21:35.401363 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ls2gx" Sep 30 17:21:35 crc kubenswrapper[4778]: I0930 17:21:35.514103 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hpfsj" Sep 30 17:21:35 crc kubenswrapper[4778]: I0930 17:21:35.514863 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hpfsj" Sep 30 17:21:35 crc kubenswrapper[4778]: I0930 17:21:35.560317 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hpfsj" Sep 30 17:21:36 crc kubenswrapper[4778]: I0930 17:21:36.411094 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hpfsj" Sep 30 17:21:37 crc kubenswrapper[4778]: I0930 17:21:37.753424 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-56687"] Sep 30 17:21:42 crc kubenswrapper[4778]: I0930 17:21:42.365223 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bcbgk" Sep 30 17:21:42 crc kubenswrapper[4778]: I0930 17:21:42.365691 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bcbgk" Sep 30 17:21:42 crc kubenswrapper[4778]: I0930 17:21:42.414936 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bcbgk" Sep 30 17:21:42 crc kubenswrapper[4778]: I0930 17:21:42.459375 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bcbgk" Sep 30 17:21:42 crc kubenswrapper[4778]: I0930 17:21:42.497908 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wfx99" Sep 30 17:21:42 crc kubenswrapper[4778]: I0930 17:21:42.497952 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-wfx99" Sep 30 17:21:42 crc kubenswrapper[4778]: I0930 17:21:42.538563 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wfx99" Sep 30 17:21:42 crc kubenswrapper[4778]: I0930 17:21:42.685044 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rhbnw" Sep 30 17:21:42 crc kubenswrapper[4778]: I0930 17:21:42.685121 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rhbnw" Sep 30 17:21:42 crc kubenswrapper[4778]: I0930 17:21:42.731584 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rhbnw" Sep 30 17:21:43 crc kubenswrapper[4778]: I0930 17:21:43.436972 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wfx99" Sep 30 17:21:43 crc kubenswrapper[4778]: I0930 17:21:43.446225 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rhbnw" Sep 30 17:21:44 crc kubenswrapper[4778]: I0930 17:21:44.545823 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rhbnw"] Sep 30 17:21:44 crc kubenswrapper[4778]: I0930 17:21:44.993037 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-42qm2" Sep 30 17:21:45 crc kubenswrapper[4778]: I0930 17:21:45.409357 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rhbnw" podUID="e90450b5-cbd1-44fd-9fd6-7def6ea75b33" containerName="registry-server" containerID="cri-o://0afcec66214afae79c73efd76952ca1ef73091e44c0842b6640d8725f8e994c5" gracePeriod=2 Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.322038 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rhbnw" Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.415463 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w9zg\" (UniqueName: \"kubernetes.io/projected/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-kube-api-access-7w9zg\") pod \"e90450b5-cbd1-44fd-9fd6-7def6ea75b33\" (UID: \"e90450b5-cbd1-44fd-9fd6-7def6ea75b33\") " Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.415898 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-utilities\") pod \"e90450b5-cbd1-44fd-9fd6-7def6ea75b33\" (UID: \"e90450b5-cbd1-44fd-9fd6-7def6ea75b33\") " Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.415981 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-catalog-content\") pod \"e90450b5-cbd1-44fd-9fd6-7def6ea75b33\" (UID: \"e90450b5-cbd1-44fd-9fd6-7def6ea75b33\") " Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.416732 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-utilities" (OuterVolumeSpecName: "utilities") pod "e90450b5-cbd1-44fd-9fd6-7def6ea75b33" (UID: "e90450b5-cbd1-44fd-9fd6-7def6ea75b33"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.416795 4778 generic.go:334] "Generic (PLEG): container finished" podID="e90450b5-cbd1-44fd-9fd6-7def6ea75b33" containerID="0afcec66214afae79c73efd76952ca1ef73091e44c0842b6640d8725f8e994c5" exitCode=0 Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.416830 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhbnw" event={"ID":"e90450b5-cbd1-44fd-9fd6-7def6ea75b33","Type":"ContainerDied","Data":"0afcec66214afae79c73efd76952ca1ef73091e44c0842b6640d8725f8e994c5"} Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.416855 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rhbnw" Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.416865 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rhbnw" event={"ID":"e90450b5-cbd1-44fd-9fd6-7def6ea75b33","Type":"ContainerDied","Data":"4fe2395e32335a557e1b789dedc4ed922ff912fd81500b39ae594dc0e5f13647"} Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.416886 4778 scope.go:117] "RemoveContainer" containerID="0afcec66214afae79c73efd76952ca1ef73091e44c0842b6640d8725f8e994c5" Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.430838 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-kube-api-access-7w9zg" (OuterVolumeSpecName: "kube-api-access-7w9zg") pod "e90450b5-cbd1-44fd-9fd6-7def6ea75b33" (UID: "e90450b5-cbd1-44fd-9fd6-7def6ea75b33"). InnerVolumeSpecName "kube-api-access-7w9zg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.432653 4778 scope.go:117] "RemoveContainer" containerID="b9b5ac38c2ff94b2a26931805d6eae80f2e59d0325e1078c91ab591e1218e50c" Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.457553 4778 scope.go:117] "RemoveContainer" containerID="0cc17a6330ca0e1ec7eab55698ac3c9deb9e5b46cade675946a70ee51a5654c8" Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.472308 4778 scope.go:117] "RemoveContainer" containerID="0afcec66214afae79c73efd76952ca1ef73091e44c0842b6640d8725f8e994c5" Sep 30 17:21:46 crc kubenswrapper[4778]: E0930 17:21:46.472742 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0afcec66214afae79c73efd76952ca1ef73091e44c0842b6640d8725f8e994c5\": container with ID starting with 0afcec66214afae79c73efd76952ca1ef73091e44c0842b6640d8725f8e994c5 not found: ID does not exist" containerID="0afcec66214afae79c73efd76952ca1ef73091e44c0842b6640d8725f8e994c5" Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.472773 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e90450b5-cbd1-44fd-9fd6-7def6ea75b33" (UID: "e90450b5-cbd1-44fd-9fd6-7def6ea75b33"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.472791 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0afcec66214afae79c73efd76952ca1ef73091e44c0842b6640d8725f8e994c5"} err="failed to get container status \"0afcec66214afae79c73efd76952ca1ef73091e44c0842b6640d8725f8e994c5\": rpc error: code = NotFound desc = could not find container \"0afcec66214afae79c73efd76952ca1ef73091e44c0842b6640d8725f8e994c5\": container with ID starting with 0afcec66214afae79c73efd76952ca1ef73091e44c0842b6640d8725f8e994c5 not found: ID does not exist" Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.472894 4778 scope.go:117] "RemoveContainer" containerID="b9b5ac38c2ff94b2a26931805d6eae80f2e59d0325e1078c91ab591e1218e50c" Sep 30 17:21:46 crc kubenswrapper[4778]: E0930 17:21:46.473218 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9b5ac38c2ff94b2a26931805d6eae80f2e59d0325e1078c91ab591e1218e50c\": container with ID starting with b9b5ac38c2ff94b2a26931805d6eae80f2e59d0325e1078c91ab591e1218e50c not found: ID does not exist" containerID="b9b5ac38c2ff94b2a26931805d6eae80f2e59d0325e1078c91ab591e1218e50c" Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.473243 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9b5ac38c2ff94b2a26931805d6eae80f2e59d0325e1078c91ab591e1218e50c"} err="failed to get container status \"b9b5ac38c2ff94b2a26931805d6eae80f2e59d0325e1078c91ab591e1218e50c\": rpc error: code = NotFound desc = could not find container \"b9b5ac38c2ff94b2a26931805d6eae80f2e59d0325e1078c91ab591e1218e50c\": container with ID starting with b9b5ac38c2ff94b2a26931805d6eae80f2e59d0325e1078c91ab591e1218e50c not found: ID does not exist" Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.473262 4778 scope.go:117] "RemoveContainer" containerID="0cc17a6330ca0e1ec7eab55698ac3c9deb9e5b46cade675946a70ee51a5654c8" Sep 30 17:21:46 crc kubenswrapper[4778]: E0930 17:21:46.473607 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cc17a6330ca0e1ec7eab55698ac3c9deb9e5b46cade675946a70ee51a5654c8\": container with ID starting with 0cc17a6330ca0e1ec7eab55698ac3c9deb9e5b46cade675946a70ee51a5654c8 not found: ID does not exist" containerID="0cc17a6330ca0e1ec7eab55698ac3c9deb9e5b46cade675946a70ee51a5654c8" Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.473692 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cc17a6330ca0e1ec7eab55698ac3c9deb9e5b46cade675946a70ee51a5654c8"} err="failed to get container status \"0cc17a6330ca0e1ec7eab55698ac3c9deb9e5b46cade675946a70ee51a5654c8\": rpc error: code = NotFound desc = could not find container \"0cc17a6330ca0e1ec7eab55698ac3c9deb9e5b46cade675946a70ee51a5654c8\": container with ID starting with 0cc17a6330ca0e1ec7eab55698ac3c9deb9e5b46cade675946a70ee51a5654c8 not found: ID does not exist" Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.517320 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.517346 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w9zg\" (UniqueName: 
\"kubernetes.io/projected/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-kube-api-access-7w9zg\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.517357 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e90450b5-cbd1-44fd-9fd6-7def6ea75b33-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.745448 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rhbnw"] Sep 30 17:21:46 crc kubenswrapper[4778]: I0930 17:21:46.748099 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rhbnw"] Sep 30 17:21:47 crc kubenswrapper[4778]: I0930 17:21:47.719156 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e90450b5-cbd1-44fd-9fd6-7def6ea75b33" path="/var/lib/kubelet/pods/e90450b5-cbd1-44fd-9fd6-7def6ea75b33/volumes" Sep 30 17:21:48 crc kubenswrapper[4778]: I0930 17:21:48.746729 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-42qm2"] Sep 30 17:21:48 crc kubenswrapper[4778]: I0930 17:21:48.746990 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-42qm2" podUID="7d747b8a-a90d-4f5b-8431-cbe604c9ee7e" containerName="registry-server" containerID="cri-o://270e9992fb03a647be9ab0ebb08bcb1dd231976958b35fc2ab321399017ecc6e" gracePeriod=2 Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.093598 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42qm2" Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.252502 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-catalog-content\") pod \"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e\" (UID: \"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e\") " Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.252598 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78dtx\" (UniqueName: \"kubernetes.io/projected/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-kube-api-access-78dtx\") pod \"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e\" (UID: \"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e\") " Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.252656 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-utilities\") pod \"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e\" (UID: \"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e\") " Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.253859 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-utilities" (OuterVolumeSpecName: "utilities") pod "7d747b8a-a90d-4f5b-8431-cbe604c9ee7e" (UID: "7d747b8a-a90d-4f5b-8431-cbe604c9ee7e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.256668 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-kube-api-access-78dtx" (OuterVolumeSpecName: "kube-api-access-78dtx") pod "7d747b8a-a90d-4f5b-8431-cbe604c9ee7e" (UID: "7d747b8a-a90d-4f5b-8431-cbe604c9ee7e"). InnerVolumeSpecName "kube-api-access-78dtx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.265119 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d747b8a-a90d-4f5b-8431-cbe604c9ee7e" (UID: "7d747b8a-a90d-4f5b-8431-cbe604c9ee7e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.355062 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78dtx\" (UniqueName: \"kubernetes.io/projected/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-kube-api-access-78dtx\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.355118 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.355135 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.437938 4778 generic.go:334] "Generic (PLEG): container finished" podID="7d747b8a-a90d-4f5b-8431-cbe604c9ee7e" containerID="270e9992fb03a647be9ab0ebb08bcb1dd231976958b35fc2ab321399017ecc6e" exitCode=0 Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.438005 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42qm2" event={"ID":"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e","Type":"ContainerDied","Data":"270e9992fb03a647be9ab0ebb08bcb1dd231976958b35fc2ab321399017ecc6e"} Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.438083 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-42qm2" event={"ID":"7d747b8a-a90d-4f5b-8431-cbe604c9ee7e","Type":"ContainerDied","Data":"10bba515450df2c9ca5de7a2a90580f7c2e6189af202026b2581891e16247cb1"} Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.438095 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-42qm2" Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.438109 4778 scope.go:117] "RemoveContainer" containerID="270e9992fb03a647be9ab0ebb08bcb1dd231976958b35fc2ab321399017ecc6e" Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.461438 4778 scope.go:117] "RemoveContainer" containerID="4dc0d87f9fe13d371785e3838c807cd3b610e5d439c77454f2e0b2b2b72147e0" Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.473717 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-42qm2"] Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.476430 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-42qm2"] Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.504235 4778 scope.go:117] "RemoveContainer" containerID="5b1de6f4f8bfc0f78160f49d1565557b0a8af72f43961718687018879211f433" Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.520120 4778 scope.go:117] "RemoveContainer" containerID="270e9992fb03a647be9ab0ebb08bcb1dd231976958b35fc2ab321399017ecc6e" Sep 30 17:21:49 crc kubenswrapper[4778]: E0930 17:21:49.520736 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"270e9992fb03a647be9ab0ebb08bcb1dd231976958b35fc2ab321399017ecc6e\": container with ID starting with 270e9992fb03a647be9ab0ebb08bcb1dd231976958b35fc2ab321399017ecc6e not found: ID does not exist" containerID="270e9992fb03a647be9ab0ebb08bcb1dd231976958b35fc2ab321399017ecc6e" Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.520784 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"270e9992fb03a647be9ab0ebb08bcb1dd231976958b35fc2ab321399017ecc6e"} err="failed to get container status \"270e9992fb03a647be9ab0ebb08bcb1dd231976958b35fc2ab321399017ecc6e\": rpc error: code = NotFound desc = could not find container \"270e9992fb03a647be9ab0ebb08bcb1dd231976958b35fc2ab321399017ecc6e\": container with ID starting with 270e9992fb03a647be9ab0ebb08bcb1dd231976958b35fc2ab321399017ecc6e not found: ID does not exist" Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.520816 4778 scope.go:117] "RemoveContainer" containerID="4dc0d87f9fe13d371785e3838c807cd3b610e5d439c77454f2e0b2b2b72147e0" Sep 30 17:21:49 crc kubenswrapper[4778]: E0930 17:21:49.521720 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dc0d87f9fe13d371785e3838c807cd3b610e5d439c77454f2e0b2b2b72147e0\": container with ID starting with 4dc0d87f9fe13d371785e3838c807cd3b610e5d439c77454f2e0b2b2b72147e0 not found: ID does not exist" containerID="4dc0d87f9fe13d371785e3838c807cd3b610e5d439c77454f2e0b2b2b72147e0" Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.521757 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dc0d87f9fe13d371785e3838c807cd3b610e5d439c77454f2e0b2b2b72147e0"} err="failed to get container status \"4dc0d87f9fe13d371785e3838c807cd3b610e5d439c77454f2e0b2b2b72147e0\": rpc error: code = NotFound desc = could not find container \"4dc0d87f9fe13d371785e3838c807cd3b610e5d439c77454f2e0b2b2b72147e0\": container with ID starting with 4dc0d87f9fe13d371785e3838c807cd3b610e5d439c77454f2e0b2b2b72147e0 not found: ID does not exist" Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.521786 4778 scope.go:117] "RemoveContainer" 
containerID="5b1de6f4f8bfc0f78160f49d1565557b0a8af72f43961718687018879211f433" Sep 30 17:21:49 crc kubenswrapper[4778]: E0930 17:21:49.522359 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b1de6f4f8bfc0f78160f49d1565557b0a8af72f43961718687018879211f433\": container with ID starting with 5b1de6f4f8bfc0f78160f49d1565557b0a8af72f43961718687018879211f433 not found: ID does not exist" containerID="5b1de6f4f8bfc0f78160f49d1565557b0a8af72f43961718687018879211f433" Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.522438 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b1de6f4f8bfc0f78160f49d1565557b0a8af72f43961718687018879211f433"} err="failed to get container status \"5b1de6f4f8bfc0f78160f49d1565557b0a8af72f43961718687018879211f433\": rpc error: code = NotFound desc = could not find container \"5b1de6f4f8bfc0f78160f49d1565557b0a8af72f43961718687018879211f433\": container with ID starting with 5b1de6f4f8bfc0f78160f49d1565557b0a8af72f43961718687018879211f433 not found: ID does not exist" Sep 30 17:21:49 crc kubenswrapper[4778]: I0930 17:21:49.722457 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d747b8a-a90d-4f5b-8431-cbe604c9ee7e" path="/var/lib/kubelet/pods/7d747b8a-a90d-4f5b-8431-cbe604c9ee7e/volumes" Sep 30 17:22:02 crc kubenswrapper[4778]: I0930 17:22:02.787242 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-56687" podUID="e6175301-9b87-47be-8c95-a4ce7fa0a413" containerName="oauth-openshift" containerID="cri-o://95f710ba8a212a81c8291b86ceecb5af6a9120a546220412b0d1fd62b04f578a" gracePeriod=15 Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.143334 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-56687" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.178872 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-8495797ccf-djk8r"] Sep 30 17:22:03 crc kubenswrapper[4778]: E0930 17:22:03.179070 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c" containerName="pruner" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179083 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c" containerName="pruner" Sep 30 17:22:03 crc kubenswrapper[4778]: E0930 17:22:03.179092 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e90450b5-cbd1-44fd-9fd6-7def6ea75b33" containerName="registry-server" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179099 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e90450b5-cbd1-44fd-9fd6-7def6ea75b33" containerName="registry-server" Sep 30 17:22:03 crc kubenswrapper[4778]: E0930 17:22:03.179107 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e90450b5-cbd1-44fd-9fd6-7def6ea75b33" containerName="extract-utilities" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179114 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e90450b5-cbd1-44fd-9fd6-7def6ea75b33" containerName="extract-utilities" Sep 30 17:22:03 crc kubenswrapper[4778]: E0930 17:22:03.179120 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3d4534-9878-41a4-b950-5e671d5e5cdb" containerName="registry-server" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179125 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3d4534-9878-41a4-b950-5e671d5e5cdb" containerName="registry-server" Sep 30 17:22:03 crc kubenswrapper[4778]: E0930 17:22:03.179133 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d747b8a-a90d-4f5b-8431-cbe604c9ee7e" containerName="extract-content" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179139 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d747b8a-a90d-4f5b-8431-cbe604c9ee7e" containerName="extract-content" Sep 30 17:22:03 crc kubenswrapper[4778]: E0930 17:22:03.179146 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13141512-8786-4c16-8cf6-b836d2b3158f" containerName="pruner" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179153 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="13141512-8786-4c16-8cf6-b836d2b3158f" containerName="pruner" Sep 30 17:22:03 crc kubenswrapper[4778]: E0930 17:22:03.179164 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2654d449-8673-41a9-b2a9-c5f986819740" containerName="extract-utilities" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179170 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2654d449-8673-41a9-b2a9-c5f986819740" containerName="extract-utilities" Sep 30 17:22:03 crc kubenswrapper[4778]: E0930 17:22:03.179179 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6175301-9b87-47be-8c95-a4ce7fa0a413" containerName="oauth-openshift" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179184 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6175301-9b87-47be-8c95-a4ce7fa0a413" containerName="oauth-openshift" Sep 30 17:22:03 crc kubenswrapper[4778]: E0930 17:22:03.179192 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7d747b8a-a90d-4f5b-8431-cbe604c9ee7e" containerName="extract-utilities" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179197 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d747b8a-a90d-4f5b-8431-cbe604c9ee7e" containerName="extract-utilities" Sep 30 17:22:03 crc kubenswrapper[4778]: E0930 17:22:03.179207 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3d4534-9878-41a4-b950-5e671d5e5cdb" containerName="extract-content" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179213 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3d4534-9878-41a4-b950-5e671d5e5cdb" containerName="extract-content" Sep 30 17:22:03 crc kubenswrapper[4778]: E0930 17:22:03.179220 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2654d449-8673-41a9-b2a9-c5f986819740" containerName="extract-content" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179226 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2654d449-8673-41a9-b2a9-c5f986819740" containerName="extract-content" Sep 30 17:22:03 crc kubenswrapper[4778]: E0930 17:22:03.179234 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d747b8a-a90d-4f5b-8431-cbe604c9ee7e" containerName="registry-server" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179240 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d747b8a-a90d-4f5b-8431-cbe604c9ee7e" containerName="registry-server" Sep 30 17:22:03 crc kubenswrapper[4778]: E0930 17:22:03.179249 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e90450b5-cbd1-44fd-9fd6-7def6ea75b33" containerName="extract-content" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179254 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e90450b5-cbd1-44fd-9fd6-7def6ea75b33" containerName="extract-content" Sep 30 17:22:03 crc kubenswrapper[4778]: E0930 17:22:03.179263 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a3d4534-9878-41a4-b950-5e671d5e5cdb" containerName="extract-utilities" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179269 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a3d4534-9878-41a4-b950-5e671d5e5cdb" containerName="extract-utilities" Sep 30 17:22:03 crc kubenswrapper[4778]: E0930 17:22:03.179282 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2654d449-8673-41a9-b2a9-c5f986819740" containerName="registry-server" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179288 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2654d449-8673-41a9-b2a9-c5f986819740" containerName="registry-server" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179371 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eb6c208-5d43-4c62-bdba-d4cf4e4aef3c" containerName="pruner" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179382 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2654d449-8673-41a9-b2a9-c5f986819740" containerName="registry-server" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179390 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a3d4534-9878-41a4-b950-5e671d5e5cdb" containerName="registry-server" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179396 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e90450b5-cbd1-44fd-9fd6-7def6ea75b33" containerName="registry-server" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179402 4778 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="13141512-8786-4c16-8cf6-b836d2b3158f" containerName="pruner" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179411 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6175301-9b87-47be-8c95-a4ce7fa0a413" containerName="oauth-openshift" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179417 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d747b8a-a90d-4f5b-8431-cbe604c9ee7e" containerName="registry-server" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.179868 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.185871 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8495797ccf-djk8r"] Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.238382 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-login\") pod \"e6175301-9b87-47be-8c95-a4ce7fa0a413\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.238741 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-provider-selection\") pod \"e6175301-9b87-47be-8c95-a4ce7fa0a413\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.238896 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-router-certs\") pod \"e6175301-9b87-47be-8c95-a4ce7fa0a413\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.239043 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-service-ca\") pod \"e6175301-9b87-47be-8c95-a4ce7fa0a413\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.239186 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-error\") pod \"e6175301-9b87-47be-8c95-a4ce7fa0a413\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.239271 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e6175301-9b87-47be-8c95-a4ce7fa0a413-audit-dir\") pod \"e6175301-9b87-47be-8c95-a4ce7fa0a413\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.239380 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfg8p\" (UniqueName: \"kubernetes.io/projected/e6175301-9b87-47be-8c95-a4ce7fa0a413-kube-api-access-bfg8p\") pod \"e6175301-9b87-47be-8c95-a4ce7fa0a413\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") 
" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.239475 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-trusted-ca-bundle\") pod \"e6175301-9b87-47be-8c95-a4ce7fa0a413\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.239550 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-serving-cert\") pod \"e6175301-9b87-47be-8c95-a4ce7fa0a413\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.239640 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-audit-policies\") pod \"e6175301-9b87-47be-8c95-a4ce7fa0a413\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.239736 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-session\") pod \"e6175301-9b87-47be-8c95-a4ce7fa0a413\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.239821 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-ocp-branding-template\") pod \"e6175301-9b87-47be-8c95-a4ce7fa0a413\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.239909 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-idp-0-file-data\") pod \"e6175301-9b87-47be-8c95-a4ce7fa0a413\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.239997 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-cliconfig\") pod \"e6175301-9b87-47be-8c95-a4ce7fa0a413\" (UID: \"e6175301-9b87-47be-8c95-a4ce7fa0a413\") " Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.241102 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "e6175301-9b87-47be-8c95-a4ce7fa0a413" (UID: "e6175301-9b87-47be-8c95-a4ce7fa0a413"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.242237 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "e6175301-9b87-47be-8c95-a4ce7fa0a413" (UID: "e6175301-9b87-47be-8c95-a4ce7fa0a413"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.242657 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "e6175301-9b87-47be-8c95-a4ce7fa0a413" (UID: "e6175301-9b87-47be-8c95-a4ce7fa0a413"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.243145 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e6175301-9b87-47be-8c95-a4ce7fa0a413-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e6175301-9b87-47be-8c95-a4ce7fa0a413" (UID: "e6175301-9b87-47be-8c95-a4ce7fa0a413"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.243266 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "e6175301-9b87-47be-8c95-a4ce7fa0a413" (UID: "e6175301-9b87-47be-8c95-a4ce7fa0a413"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.246604 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "e6175301-9b87-47be-8c95-a4ce7fa0a413" (UID: "e6175301-9b87-47be-8c95-a4ce7fa0a413"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.246872 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "e6175301-9b87-47be-8c95-a4ce7fa0a413" (UID: "e6175301-9b87-47be-8c95-a4ce7fa0a413"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.247180 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6175301-9b87-47be-8c95-a4ce7fa0a413-kube-api-access-bfg8p" (OuterVolumeSpecName: "kube-api-access-bfg8p") pod "e6175301-9b87-47be-8c95-a4ce7fa0a413" (UID: "e6175301-9b87-47be-8c95-a4ce7fa0a413"). InnerVolumeSpecName "kube-api-access-bfg8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.247402 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "e6175301-9b87-47be-8c95-a4ce7fa0a413" (UID: "e6175301-9b87-47be-8c95-a4ce7fa0a413"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.247881 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "e6175301-9b87-47be-8c95-a4ce7fa0a413" (UID: "e6175301-9b87-47be-8c95-a4ce7fa0a413"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.247988 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "e6175301-9b87-47be-8c95-a4ce7fa0a413" (UID: "e6175301-9b87-47be-8c95-a4ce7fa0a413"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.251940 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "e6175301-9b87-47be-8c95-a4ce7fa0a413" (UID: "e6175301-9b87-47be-8c95-a4ce7fa0a413"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.260043 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "e6175301-9b87-47be-8c95-a4ce7fa0a413" (UID: "e6175301-9b87-47be-8c95-a4ce7fa0a413"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.260258 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "e6175301-9b87-47be-8c95-a4ce7fa0a413" (UID: "e6175301-9b87-47be-8c95-a4ce7fa0a413"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.342964 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-router-certs\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.343065 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-user-template-login\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.343101 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.343255 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.343318 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-session\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.343397 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-user-template-error\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.343426 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.343444 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.343504 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/80eeda32-0634-4a29-96f5-4d9b438a9bd9-audit-policies\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.343525 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-service-ca\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.343555 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/80eeda32-0634-4a29-96f5-4d9b438a9bd9-audit-dir\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.343661 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.343711 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.343816 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgg4m\" (UniqueName: \"kubernetes.io/projected/80eeda32-0634-4a29-96f5-4d9b438a9bd9-kube-api-access-kgg4m\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.344074 4778 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e6175301-9b87-47be-8c95-a4ce7fa0a413-audit-dir\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.344098 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfg8p\" (UniqueName: \"kubernetes.io/projected/e6175301-9b87-47be-8c95-a4ce7fa0a413-kube-api-access-bfg8p\") on node \"crc\" DevicePath \"\"" Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 
17:22:03.344132 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.344142 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.344151 4778 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-audit-policies\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.344162 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.344174 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.344184 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.344194 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.344204 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.344214 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.344224 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.344233 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.344243 4778 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e6175301-9b87-47be-8c95-a4ce7fa0a413-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.445360 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-user-template-error\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.445421 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.445446 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.445477 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/80eeda32-0634-4a29-96f5-4d9b438a9bd9-audit-policies\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.445507 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-service-ca\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.445537 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/80eeda32-0634-4a29-96f5-4d9b438a9bd9-audit-dir\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.445595 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/80eeda32-0634-4a29-96f5-4d9b438a9bd9-audit-dir\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.445912 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.446006 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.446040 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgg4m\" (UniqueName: \"kubernetes.io/projected/80eeda32-0634-4a29-96f5-4d9b438a9bd9-kube-api-access-kgg4m\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.446093 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-router-certs\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.446148 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-user-template-login\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.446179 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.446241 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.446401 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-session\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.446836 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-service-ca\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.448297 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.448345 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.448915 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/80eeda32-0634-4a29-96f5-4d9b438a9bd9-audit-policies\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.449108 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.449338 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-session\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.450801 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-user-template-error\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.451607 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.451671 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.451980 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-user-template-login\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.453834 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-router-certs\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.454435 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/80eeda32-0634-4a29-96f5-4d9b438a9bd9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.472799 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgg4m\" (UniqueName: \"kubernetes.io/projected/80eeda32-0634-4a29-96f5-4d9b438a9bd9-kube-api-access-kgg4m\") pod \"oauth-openshift-8495797ccf-djk8r\" (UID: \"80eeda32-0634-4a29-96f5-4d9b438a9bd9\") " pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.504814 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.518681 4778 generic.go:334] "Generic (PLEG): container finished" podID="e6175301-9b87-47be-8c95-a4ce7fa0a413" containerID="95f710ba8a212a81c8291b86ceecb5af6a9120a546220412b0d1fd62b04f578a" exitCode=0
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.518727 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-56687" event={"ID":"e6175301-9b87-47be-8c95-a4ce7fa0a413","Type":"ContainerDied","Data":"95f710ba8a212a81c8291b86ceecb5af6a9120a546220412b0d1fd62b04f578a"}
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.518753 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-56687" event={"ID":"e6175301-9b87-47be-8c95-a4ce7fa0a413","Type":"ContainerDied","Data":"5dfe25609eb4a37d12434609067eaeeb6504f9416e5b5ae024011a8da70405dd"}
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.518770 4778 scope.go:117] "RemoveContainer" containerID="95f710ba8a212a81c8291b86ceecb5af6a9120a546220412b0d1fd62b04f578a"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.518800 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-56687"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.537707 4778 scope.go:117] "RemoveContainer" containerID="95f710ba8a212a81c8291b86ceecb5af6a9120a546220412b0d1fd62b04f578a"
Sep 30 17:22:03 crc kubenswrapper[4778]: E0930 17:22:03.539244 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95f710ba8a212a81c8291b86ceecb5af6a9120a546220412b0d1fd62b04f578a\": container with ID starting with 95f710ba8a212a81c8291b86ceecb5af6a9120a546220412b0d1fd62b04f578a not found: ID does not exist" containerID="95f710ba8a212a81c8291b86ceecb5af6a9120a546220412b0d1fd62b04f578a"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.539289 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95f710ba8a212a81c8291b86ceecb5af6a9120a546220412b0d1fd62b04f578a"} err="failed to get container status \"95f710ba8a212a81c8291b86ceecb5af6a9120a546220412b0d1fd62b04f578a\": rpc error: code = NotFound desc = could not find container \"95f710ba8a212a81c8291b86ceecb5af6a9120a546220412b0d1fd62b04f578a\": container with ID starting with 95f710ba8a212a81c8291b86ceecb5af6a9120a546220412b0d1fd62b04f578a not found: ID does not exist"
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.548258 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-56687"]
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.551467 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-56687"]
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.676271 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8495797ccf-djk8r"]
Sep 30 17:22:03 crc kubenswrapper[4778]: I0930 17:22:03.720558 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6175301-9b87-47be-8c95-a4ce7fa0a413" path="/var/lib/kubelet/pods/e6175301-9b87-47be-8c95-a4ce7fa0a413/volumes"
Sep 30 17:22:04 crc kubenswrapper[4778]: I0930 17:22:04.527030 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" event={"ID":"80eeda32-0634-4a29-96f5-4d9b438a9bd9","Type":"ContainerStarted","Data":"34489435a19fb0bf45e2b7c5f476ed91aefad9eabecb58ccc3c4769e81b94cda"}
Sep 30 17:22:04 crc kubenswrapper[4778]: I0930 17:22:04.527222 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" event={"ID":"80eeda32-0634-4a29-96f5-4d9b438a9bd9","Type":"ContainerStarted","Data":"a1c088ac8c83bc6ee8dddd725555743552dd973090b983bffe5d2497b328422c"}
Sep 30 17:22:04 crc kubenswrapper[4778]: I0930 17:22:04.527238 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:04 crc kubenswrapper[4778]: I0930 17:22:04.531899 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r"
Sep 30 17:22:04 crc kubenswrapper[4778]: I0930 17:22:04.560168 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-8495797ccf-djk8r" podStartSLOduration=27.560148524 podStartE2EDuration="27.560148524s" podCreationTimestamp="2025-09-30 17:21:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:22:04.559137576 +0000 UTC m=+263.549035399" watchObservedRunningTime="2025-09-30 17:22:04.560148524 +0000 UTC m=+263.550046327"
Sep 30 17:22:16 crc kubenswrapper[4778]: I0930 17:22:16.670468 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wfx99"]
Sep 30 17:22:16 crc kubenswrapper[4778]: I0930 17:22:16.673738 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wfx99" podUID="7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b" containerName="registry-server" containerID="cri-o://075f0a930271f6f1725ada870056b64b15b2abddfb8ef10a93e33e0e913478c1" gracePeriod=30
Sep 30 17:22:16 crc kubenswrapper[4778]: I0930 17:22:16.677732 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bcbgk"]
Sep 30 17:22:16 crc kubenswrapper[4778]: I0930 17:22:16.677994 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bcbgk" podUID="e159da09-2a9f-4472-acca-abe0193feb9f" containerName="registry-server" containerID="cri-o://1a3aad032acc08231722b83fab0cc1a037db104a1d6f2cde1f7fbb262c488587" gracePeriod=30
Sep 30 17:22:16 crc kubenswrapper[4778]: I0930 17:22:16.703333 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bjv8b"]
Sep 30 17:22:16 crc kubenswrapper[4778]: I0930 17:22:16.703584 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" podUID="385ca6a0-940d-409a-a0aa-b22ab8920177" containerName="marketplace-operator" containerID="cri-o://0b4181a02a2d03d2c366adbb69cccaec7ef34b958bdf7fdcc4aac093b39b59f3" gracePeriod=30
Sep 30 17:22:16 crc kubenswrapper[4778]: I0930 17:22:16.707255 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ls2gx"]
Sep 30 17:22:16 crc kubenswrapper[4778]: I0930 17:22:16.707510 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ls2gx" podUID="dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122" containerName="registry-server" containerID="cri-o://8f6945349f62c2ff203027a41a14381f3f1b3a2601efe66ef14ba797568acc55" gracePeriod=30
Sep 30 17:22:16 crc kubenswrapper[4778]: I0930 17:22:16.719904 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hpfsj"]
Sep 30 17:22:16 crc kubenswrapper[4778]: I0930 17:22:16.720151 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hpfsj" podUID="bb93f0e3-e3b6-41bc-af18-91e3107fc79a" containerName="registry-server" containerID="cri-o://a3ae404c6576ae7345ea0e24a55339b72feb6064cdc7e08fdd8f1455c92ffb81" gracePeriod=30
Sep 30 17:22:16 crc kubenswrapper[4778]: I0930 17:22:16.724545 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9nzgv"]
Sep 30 17:22:16 crc kubenswrapper[4778]: I0930 17:22:16.725529 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9nzgv"
Sep 30 17:22:16 crc kubenswrapper[4778]: I0930 17:22:16.727982 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9nzgv"]
Sep 30 17:22:16 crc kubenswrapper[4778]: I0930 17:22:16.910002 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ms9n\" (UniqueName: \"kubernetes.io/projected/d46eace9-9047-4db1-acb9-1a588ab49434-kube-api-access-2ms9n\") pod \"marketplace-operator-79b997595-9nzgv\" (UID: \"d46eace9-9047-4db1-acb9-1a588ab49434\") " pod="openshift-marketplace/marketplace-operator-79b997595-9nzgv"
Sep 30 17:22:16 crc kubenswrapper[4778]: I0930 17:22:16.910106 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d46eace9-9047-4db1-acb9-1a588ab49434-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9nzgv\" (UID: \"d46eace9-9047-4db1-acb9-1a588ab49434\") " pod="openshift-marketplace/marketplace-operator-79b997595-9nzgv"
Sep 30 17:22:16 crc kubenswrapper[4778]: I0930 17:22:16.910293 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d46eace9-9047-4db1-acb9-1a588ab49434-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9nzgv\" (UID: \"d46eace9-9047-4db1-acb9-1a588ab49434\") " pod="openshift-marketplace/marketplace-operator-79b997595-9nzgv"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.011427 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ms9n\" (UniqueName: \"kubernetes.io/projected/d46eace9-9047-4db1-acb9-1a588ab49434-kube-api-access-2ms9n\") pod \"marketplace-operator-79b997595-9nzgv\" (UID: \"d46eace9-9047-4db1-acb9-1a588ab49434\") " pod="openshift-marketplace/marketplace-operator-79b997595-9nzgv"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.011853 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d46eace9-9047-4db1-acb9-1a588ab49434-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9nzgv\" (UID: \"d46eace9-9047-4db1-acb9-1a588ab49434\") " pod="openshift-marketplace/marketplace-operator-79b997595-9nzgv"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.011885 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d46eace9-9047-4db1-acb9-1a588ab49434-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9nzgv\" (UID: \"d46eace9-9047-4db1-acb9-1a588ab49434\") " pod="openshift-marketplace/marketplace-operator-79b997595-9nzgv"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.013463 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d46eace9-9047-4db1-acb9-1a588ab49434-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-9nzgv\" (UID: \"d46eace9-9047-4db1-acb9-1a588ab49434\") " pod="openshift-marketplace/marketplace-operator-79b997595-9nzgv"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.019286 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/d46eace9-9047-4db1-acb9-1a588ab49434-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-9nzgv\" (UID: \"d46eace9-9047-4db1-acb9-1a588ab49434\") " pod="openshift-marketplace/marketplace-operator-79b997595-9nzgv"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.035215 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ms9n\" (UniqueName: \"kubernetes.io/projected/d46eace9-9047-4db1-acb9-1a588ab49434-kube-api-access-2ms9n\") pod \"marketplace-operator-79b997595-9nzgv\" (UID: \"d46eace9-9047-4db1-acb9-1a588ab49434\") " pod="openshift-marketplace/marketplace-operator-79b997595-9nzgv"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.158700 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-9nzgv"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.185687 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wfx99"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.254966 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ls2gx"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.276898 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bcbgk"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.282320 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.287249 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hpfsj"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.315345 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dgmf\" (UniqueName: \"kubernetes.io/projected/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-kube-api-access-5dgmf\") pod \"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b\" (UID: \"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b\") "
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.315465 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-utilities\") pod \"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b\" (UID: \"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b\") "
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.315505 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-catalog-content\") pod \"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b\" (UID: \"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b\") "
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.322566 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-utilities" (OuterVolumeSpecName: "utilities") pod "7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b" (UID: "7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.329113 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-kube-api-access-5dgmf" (OuterVolumeSpecName: "kube-api-access-5dgmf") pod "7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b" (UID: "7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b"). InnerVolumeSpecName "kube-api-access-5dgmf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.386816 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b" (UID: "7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.416733 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-catalog-content\") pod \"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122\" (UID: \"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122\") "
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.416796 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dz4c\" (UniqueName: \"kubernetes.io/projected/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-kube-api-access-8dz4c\") pod \"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122\" (UID: \"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122\") "
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.417018 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/385ca6a0-940d-409a-a0aa-b22ab8920177-marketplace-operator-metrics\") pod \"385ca6a0-940d-409a-a0aa-b22ab8920177\" (UID: \"385ca6a0-940d-409a-a0aa-b22ab8920177\") "
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.417321 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/385ca6a0-940d-409a-a0aa-b22ab8920177-marketplace-trusted-ca\") pod \"385ca6a0-940d-409a-a0aa-b22ab8920177\" (UID: \"385ca6a0-940d-409a-a0aa-b22ab8920177\") "
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.417353 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vmgz\" (UniqueName: \"kubernetes.io/projected/385ca6a0-940d-409a-a0aa-b22ab8920177-kube-api-access-6vmgz\") pod \"385ca6a0-940d-409a-a0aa-b22ab8920177\" (UID: \"385ca6a0-940d-409a-a0aa-b22ab8920177\") "
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.417389 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-utilities\") pod \"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122\" (UID: \"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122\") "
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.417413 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-utilities\") pod \"bb93f0e3-e3b6-41bc-af18-91e3107fc79a\" (UID: \"bb93f0e3-e3b6-41bc-af18-91e3107fc79a\") "
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.417454 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r6p5\" (UniqueName: \"kubernetes.io/projected/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-kube-api-access-9r6p5\") pod \"bb93f0e3-e3b6-41bc-af18-91e3107fc79a\" (UID: \"bb93f0e3-e3b6-41bc-af18-91e3107fc79a\") "
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.417487 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-catalog-content\") pod \"bb93f0e3-e3b6-41bc-af18-91e3107fc79a\" (UID: \"bb93f0e3-e3b6-41bc-af18-91e3107fc79a\") "
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.417506 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e159da09-2a9f-4472-acca-abe0193feb9f-utilities\") pod \"e159da09-2a9f-4472-acca-abe0193feb9f\" (UID: \"e159da09-2a9f-4472-acca-abe0193feb9f\") "
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.417671 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e159da09-2a9f-4472-acca-abe0193feb9f-catalog-content\") pod \"e159da09-2a9f-4472-acca-abe0193feb9f\" (UID: \"e159da09-2a9f-4472-acca-abe0193feb9f\") "
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.417751 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7wj5\" (UniqueName: \"kubernetes.io/projected/e159da09-2a9f-4472-acca-abe0193feb9f-kube-api-access-p7wj5\") pod \"e159da09-2a9f-4472-acca-abe0193feb9f\" (UID: \"e159da09-2a9f-4472-acca-abe0193feb9f\") "
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.418486 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dgmf\" (UniqueName: \"kubernetes.io/projected/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-kube-api-access-5dgmf\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.418504 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.418514 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.418375 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-utilities" (OuterVolumeSpecName: "utilities") pod "dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122" (UID: "dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.418778 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-utilities" (OuterVolumeSpecName: "utilities") pod "bb93f0e3-e3b6-41bc-af18-91e3107fc79a" (UID: "bb93f0e3-e3b6-41bc-af18-91e3107fc79a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.419114 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e159da09-2a9f-4472-acca-abe0193feb9f-utilities" (OuterVolumeSpecName: "utilities") pod "e159da09-2a9f-4472-acca-abe0193feb9f" (UID: "e159da09-2a9f-4472-acca-abe0193feb9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.420604 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/385ca6a0-940d-409a-a0aa-b22ab8920177-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "385ca6a0-940d-409a-a0aa-b22ab8920177" (UID: "385ca6a0-940d-409a-a0aa-b22ab8920177"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.421962 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-kube-api-access-8dz4c" (OuterVolumeSpecName: "kube-api-access-8dz4c") pod "dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122" (UID: "dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122"). InnerVolumeSpecName "kube-api-access-8dz4c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.422492 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/385ca6a0-940d-409a-a0aa-b22ab8920177-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "385ca6a0-940d-409a-a0aa-b22ab8920177" (UID: "385ca6a0-940d-409a-a0aa-b22ab8920177"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.422940 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-kube-api-access-9r6p5" (OuterVolumeSpecName: "kube-api-access-9r6p5") pod "bb93f0e3-e3b6-41bc-af18-91e3107fc79a" (UID: "bb93f0e3-e3b6-41bc-af18-91e3107fc79a"). InnerVolumeSpecName "kube-api-access-9r6p5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.424186 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e159da09-2a9f-4472-acca-abe0193feb9f-kube-api-access-p7wj5" (OuterVolumeSpecName: "kube-api-access-p7wj5") pod "e159da09-2a9f-4472-acca-abe0193feb9f" (UID: "e159da09-2a9f-4472-acca-abe0193feb9f"). InnerVolumeSpecName "kube-api-access-p7wj5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.431528 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122" (UID: "dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.432877 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/385ca6a0-940d-409a-a0aa-b22ab8920177-kube-api-access-6vmgz" (OuterVolumeSpecName: "kube-api-access-6vmgz") pod "385ca6a0-940d-409a-a0aa-b22ab8920177" (UID: "385ca6a0-940d-409a-a0aa-b22ab8920177"). InnerVolumeSpecName "kube-api-access-6vmgz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.477821 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e159da09-2a9f-4472-acca-abe0193feb9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e159da09-2a9f-4472-acca-abe0193feb9f" (UID: "e159da09-2a9f-4472-acca-abe0193feb9f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.514464 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb93f0e3-e3b6-41bc-af18-91e3107fc79a" (UID: "bb93f0e3-e3b6-41bc-af18-91e3107fc79a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.519736 4778 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/385ca6a0-940d-409a-a0aa-b22ab8920177-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.519773 4778 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/385ca6a0-940d-409a-a0aa-b22ab8920177-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.519785 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vmgz\" (UniqueName: \"kubernetes.io/projected/385ca6a0-940d-409a-a0aa-b22ab8920177-kube-api-access-6vmgz\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.519797 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.519806 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.519814 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r6p5\" (UniqueName: \"kubernetes.io/projected/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-kube-api-access-9r6p5\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.519822 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb93f0e3-e3b6-41bc-af18-91e3107fc79a-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.519830 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e159da09-2a9f-4472-acca-abe0193feb9f-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.519838 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e159da09-2a9f-4472-acca-abe0193feb9f-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.519846 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7wj5\" (UniqueName: \"kubernetes.io/projected/e159da09-2a9f-4472-acca-abe0193feb9f-kube-api-access-p7wj5\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.519855 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.519864 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dz4c\" (UniqueName: \"kubernetes.io/projected/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122-kube-api-access-8dz4c\") on node \"crc\" DevicePath \"\""
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.589756 4778 generic.go:334] "Generic (PLEG): container finished" podID="385ca6a0-940d-409a-a0aa-b22ab8920177" containerID="0b4181a02a2d03d2c366adbb69cccaec7ef34b958bdf7fdcc4aac093b39b59f3" exitCode=0
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.589831 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" event={"ID":"385ca6a0-940d-409a-a0aa-b22ab8920177","Type":"ContainerDied","Data":"0b4181a02a2d03d2c366adbb69cccaec7ef34b958bdf7fdcc4aac093b39b59f3"}
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.589866 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.589908 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-bjv8b" event={"ID":"385ca6a0-940d-409a-a0aa-b22ab8920177","Type":"ContainerDied","Data":"db5dff8fc4b4e3c8e77b18a638ba0d5c88666f3faec6c3fac9fda17c68c7c412"}
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.589980 4778 scope.go:117] "RemoveContainer" containerID="0b4181a02a2d03d2c366adbb69cccaec7ef34b958bdf7fdcc4aac093b39b59f3"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.593506 4778 generic.go:334] "Generic (PLEG): container finished" podID="dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122" containerID="8f6945349f62c2ff203027a41a14381f3f1b3a2601efe66ef14ba797568acc55" exitCode=0
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.593564 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ls2gx"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.593591 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls2gx" event={"ID":"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122","Type":"ContainerDied","Data":"8f6945349f62c2ff203027a41a14381f3f1b3a2601efe66ef14ba797568acc55"}
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.593646 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ls2gx" event={"ID":"dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122","Type":"ContainerDied","Data":"8e784aba9551465db2204ff462b724cb5363f99e414cf3b019f3d300be025772"}
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.599076 4778 generic.go:334] "Generic (PLEG): container finished" podID="7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b" containerID="075f0a930271f6f1725ada870056b64b15b2abddfb8ef10a93e33e0e913478c1" exitCode=0
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.599150 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wfx99" event={"ID":"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b","Type":"ContainerDied","Data":"075f0a930271f6f1725ada870056b64b15b2abddfb8ef10a93e33e0e913478c1"}
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.599181 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wfx99" event={"ID":"7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b","Type":"ContainerDied","Data":"ff5e0e93ad3eac76d9b1585771c1a254b05cc29027001a9a3187dd82f0af6ff7"}
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.599256 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wfx99"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.603427 4778 generic.go:334] "Generic (PLEG): container finished" podID="bb93f0e3-e3b6-41bc-af18-91e3107fc79a" containerID="a3ae404c6576ae7345ea0e24a55339b72feb6064cdc7e08fdd8f1455c92ffb81" exitCode=0
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.603493 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpfsj" event={"ID":"bb93f0e3-e3b6-41bc-af18-91e3107fc79a","Type":"ContainerDied","Data":"a3ae404c6576ae7345ea0e24a55339b72feb6064cdc7e08fdd8f1455c92ffb81"}
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.603519 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hpfsj" event={"ID":"bb93f0e3-e3b6-41bc-af18-91e3107fc79a","Type":"ContainerDied","Data":"4f4a68faff9d0a98c2db27d0a1a9ad05c99829285659266896535e951782bbde"}
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.603583 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hpfsj"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.606253 4778 generic.go:334] "Generic (PLEG): container finished" podID="e159da09-2a9f-4472-acca-abe0193feb9f" containerID="1a3aad032acc08231722b83fab0cc1a037db104a1d6f2cde1f7fbb262c488587" exitCode=0
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.606334 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcbgk" event={"ID":"e159da09-2a9f-4472-acca-abe0193feb9f","Type":"ContainerDied","Data":"1a3aad032acc08231722b83fab0cc1a037db104a1d6f2cde1f7fbb262c488587"}
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.606384 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcbgk" event={"ID":"e159da09-2a9f-4472-acca-abe0193feb9f","Type":"ContainerDied","Data":"9cb1372030befbca7b84f8ec04c93d6501310fe27f7437275c985ab7013b43f9"}
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.606532 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bcbgk"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.628414 4778 scope.go:117] "RemoveContainer" containerID="0b4181a02a2d03d2c366adbb69cccaec7ef34b958bdf7fdcc4aac093b39b59f3"
Sep 30 17:22:17 crc kubenswrapper[4778]: E0930 17:22:17.629228 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b4181a02a2d03d2c366adbb69cccaec7ef34b958bdf7fdcc4aac093b39b59f3\": container with ID starting with 0b4181a02a2d03d2c366adbb69cccaec7ef34b958bdf7fdcc4aac093b39b59f3 not found: ID does not exist" containerID="0b4181a02a2d03d2c366adbb69cccaec7ef34b958bdf7fdcc4aac093b39b59f3"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.629284 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b4181a02a2d03d2c366adbb69cccaec7ef34b958bdf7fdcc4aac093b39b59f3"} err="failed to get container status \"0b4181a02a2d03d2c366adbb69cccaec7ef34b958bdf7fdcc4aac093b39b59f3\": rpc error: code = NotFound desc = could not find container \"0b4181a02a2d03d2c366adbb69cccaec7ef34b958bdf7fdcc4aac093b39b59f3\": container with ID starting with 0b4181a02a2d03d2c366adbb69cccaec7ef34b958bdf7fdcc4aac093b39b59f3 not found: ID does not exist"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.629322 4778 scope.go:117] "RemoveContainer" containerID="8f6945349f62c2ff203027a41a14381f3f1b3a2601efe66ef14ba797568acc55"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.629516 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bjv8b"]
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.636956 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-bjv8b"]
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.649764 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ls2gx"]
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.675586 4778 scope.go:117] "RemoveContainer" containerID="d72ce827f3380cd1e67de7e0a3338baa92cceee75bf88f480429e96a397b6507"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.681295 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ls2gx"]
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.689217 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wfx99"]
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.698692 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wfx99"]
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.701525 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-9nzgv"]
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.706586 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bcbgk"]
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.711026 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bcbgk"]
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.711304 4778 scope.go:117] "RemoveContainer" containerID="2a929fef47071100a30f4360570e2db169819f6fe3bc44e2effc487d7bc9df14"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.744189 4778 scope.go:117] "RemoveContainer" containerID="8f6945349f62c2ff203027a41a14381f3f1b3a2601efe66ef14ba797568acc55"
Sep 30 17:22:17 crc kubenswrapper[4778]: E0930 17:22:17.746574 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f6945349f62c2ff203027a41a14381f3f1b3a2601efe66ef14ba797568acc55\": container with ID starting with 8f6945349f62c2ff203027a41a14381f3f1b3a2601efe66ef14ba797568acc55 not found: ID does not exist" containerID="8f6945349f62c2ff203027a41a14381f3f1b3a2601efe66ef14ba797568acc55"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.746650 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f6945349f62c2ff203027a41a14381f3f1b3a2601efe66ef14ba797568acc55"} err="failed to get container status \"8f6945349f62c2ff203027a41a14381f3f1b3a2601efe66ef14ba797568acc55\": rpc error: code = NotFound desc = could not find container \"8f6945349f62c2ff203027a41a14381f3f1b3a2601efe66ef14ba797568acc55\": container with ID starting with 8f6945349f62c2ff203027a41a14381f3f1b3a2601efe66ef14ba797568acc55 not found: ID does not exist"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.746686 4778 scope.go:117] "RemoveContainer" containerID="d72ce827f3380cd1e67de7e0a3338baa92cceee75bf88f480429e96a397b6507"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.747267 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="385ca6a0-940d-409a-a0aa-b22ab8920177" path="/var/lib/kubelet/pods/385ca6a0-940d-409a-a0aa-b22ab8920177/volumes"
Sep 30 17:22:17 crc kubenswrapper[4778]: E0930 17:22:17.747699 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d72ce827f3380cd1e67de7e0a3338baa92cceee75bf88f480429e96a397b6507\": container with ID starting with d72ce827f3380cd1e67de7e0a3338baa92cceee75bf88f480429e96a397b6507 not found: ID does not exist" containerID="d72ce827f3380cd1e67de7e0a3338baa92cceee75bf88f480429e96a397b6507"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.747729 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d72ce827f3380cd1e67de7e0a3338baa92cceee75bf88f480429e96a397b6507"} err="failed to get container status \"d72ce827f3380cd1e67de7e0a3338baa92cceee75bf88f480429e96a397b6507\": rpc error: code = NotFound desc = could not find container \"d72ce827f3380cd1e67de7e0a3338baa92cceee75bf88f480429e96a397b6507\": container with ID starting with d72ce827f3380cd1e67de7e0a3338baa92cceee75bf88f480429e96a397b6507 not found: ID does not exist"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.747742 4778 scope.go:117] "RemoveContainer" containerID="2a929fef47071100a30f4360570e2db169819f6fe3bc44e2effc487d7bc9df14"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.748970 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b" path="/var/lib/kubelet/pods/7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b/volumes"
Sep 30 17:22:17 crc kubenswrapper[4778]: E0930 17:22:17.750920 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a929fef47071100a30f4360570e2db169819f6fe3bc44e2effc487d7bc9df14\": container with ID starting with 2a929fef47071100a30f4360570e2db169819f6fe3bc44e2effc487d7bc9df14 not found: ID does not exist" containerID="2a929fef47071100a30f4360570e2db169819f6fe3bc44e2effc487d7bc9df14"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.751000 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a929fef47071100a30f4360570e2db169819f6fe3bc44e2effc487d7bc9df14"} err="failed to get container status \"2a929fef47071100a30f4360570e2db169819f6fe3bc44e2effc487d7bc9df14\": rpc error: code = NotFound desc = could not find container \"2a929fef47071100a30f4360570e2db169819f6fe3bc44e2effc487d7bc9df14\": container with ID starting with 2a929fef47071100a30f4360570e2db169819f6fe3bc44e2effc487d7bc9df14 not found: ID does not exist"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.751047 4778 scope.go:117] "RemoveContainer" containerID="075f0a930271f6f1725ada870056b64b15b2abddfb8ef10a93e33e0e913478c1"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.751771 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122" path="/var/lib/kubelet/pods/dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122/volumes"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.763601 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e159da09-2a9f-4472-acca-abe0193feb9f" path="/var/lib/kubelet/pods/e159da09-2a9f-4472-acca-abe0193feb9f/volumes"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.766479 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hpfsj"]
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.766766 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hpfsj"]
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.790359 4778 scope.go:117] "RemoveContainer" containerID="3a172f7e2f1298d5a65a4b64254d45cec4d9e34b74d3174b2aa95f285b7874dc"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.825050 4778 scope.go:117] "RemoveContainer" containerID="87c6c540df34133a8f6eea3f490f991b287f6a7da12c023e4615acb0eebddffd"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.874799 4778 scope.go:117] "RemoveContainer" containerID="075f0a930271f6f1725ada870056b64b15b2abddfb8ef10a93e33e0e913478c1"
Sep 30 17:22:17 crc kubenswrapper[4778]: E0930 17:22:17.875557 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"075f0a930271f6f1725ada870056b64b15b2abddfb8ef10a93e33e0e913478c1\": container with ID starting with 075f0a930271f6f1725ada870056b64b15b2abddfb8ef10a93e33e0e913478c1 not found: ID does not exist" containerID="075f0a930271f6f1725ada870056b64b15b2abddfb8ef10a93e33e0e913478c1"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.875589 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075f0a930271f6f1725ada870056b64b15b2abddfb8ef10a93e33e0e913478c1"} err="failed to get container status \"075f0a930271f6f1725ada870056b64b15b2abddfb8ef10a93e33e0e913478c1\": rpc error: code = NotFound desc = could not find container \"075f0a930271f6f1725ada870056b64b15b2abddfb8ef10a93e33e0e913478c1\": container with ID starting with 075f0a930271f6f1725ada870056b64b15b2abddfb8ef10a93e33e0e913478c1 not found: ID does not exist"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.875662 4778 scope.go:117] "RemoveContainer" containerID="3a172f7e2f1298d5a65a4b64254d45cec4d9e34b74d3174b2aa95f285b7874dc"
Sep 30 17:22:17 crc kubenswrapper[4778]: E0930 17:22:17.876202 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a172f7e2f1298d5a65a4b64254d45cec4d9e34b74d3174b2aa95f285b7874dc\": container with ID starting with 3a172f7e2f1298d5a65a4b64254d45cec4d9e34b74d3174b2aa95f285b7874dc not found: ID does not exist" containerID="3a172f7e2f1298d5a65a4b64254d45cec4d9e34b74d3174b2aa95f285b7874dc"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.876226 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a172f7e2f1298d5a65a4b64254d45cec4d9e34b74d3174b2aa95f285b7874dc"} err="failed to get container status \"3a172f7e2f1298d5a65a4b64254d45cec4d9e34b74d3174b2aa95f285b7874dc\": rpc error: code = NotFound desc = could not find container \"3a172f7e2f1298d5a65a4b64254d45cec4d9e34b74d3174b2aa95f285b7874dc\": container with ID starting with 3a172f7e2f1298d5a65a4b64254d45cec4d9e34b74d3174b2aa95f285b7874dc not found: ID does not exist"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.876241 4778 scope.go:117] "RemoveContainer" containerID="87c6c540df34133a8f6eea3f490f991b287f6a7da12c023e4615acb0eebddffd"
Sep 30 17:22:17 crc kubenswrapper[4778]: E0930 17:22:17.876832 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87c6c540df34133a8f6eea3f490f991b287f6a7da12c023e4615acb0eebddffd\": container with ID starting with 87c6c540df34133a8f6eea3f490f991b287f6a7da12c023e4615acb0eebddffd not found: ID does not exist" containerID="87c6c540df34133a8f6eea3f490f991b287f6a7da12c023e4615acb0eebddffd"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.876904 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87c6c540df34133a8f6eea3f490f991b287f6a7da12c023e4615acb0eebddffd"} err="failed to get container status \"87c6c540df34133a8f6eea3f490f991b287f6a7da12c023e4615acb0eebddffd\": rpc error: code = NotFound desc = could not find container \"87c6c540df34133a8f6eea3f490f991b287f6a7da12c023e4615acb0eebddffd\": container with ID starting with 87c6c540df34133a8f6eea3f490f991b287f6a7da12c023e4615acb0eebddffd not found: ID does not exist"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.876964 4778 scope.go:117] "RemoveContainer" containerID="a3ae404c6576ae7345ea0e24a55339b72feb6064cdc7e08fdd8f1455c92ffb81"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.899570 4778 scope.go:117] "RemoveContainer" containerID="b535aecc0f121e30e87d6f0c91396c35344a7ad6b134782e4c381482346004e3"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.932187 4778 scope.go:117] "RemoveContainer" containerID="fde1cfb85200da591799a72596e32e816c99c594f22786f7641254e071af1ebe"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.954907 4778 scope.go:117] "RemoveContainer" containerID="a3ae404c6576ae7345ea0e24a55339b72feb6064cdc7e08fdd8f1455c92ffb81"
Sep 30 17:22:17 crc kubenswrapper[4778]: E0930 17:22:17.955516 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3ae404c6576ae7345ea0e24a55339b72feb6064cdc7e08fdd8f1455c92ffb81\": container with ID starting with a3ae404c6576ae7345ea0e24a55339b72feb6064cdc7e08fdd8f1455c92ffb81 not found: ID does not exist" containerID="a3ae404c6576ae7345ea0e24a55339b72feb6064cdc7e08fdd8f1455c92ffb81"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.955545 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3ae404c6576ae7345ea0e24a55339b72feb6064cdc7e08fdd8f1455c92ffb81"} err="failed to get container status \"a3ae404c6576ae7345ea0e24a55339b72feb6064cdc7e08fdd8f1455c92ffb81\": rpc error: code = NotFound desc = could not find container \"a3ae404c6576ae7345ea0e24a55339b72feb6064cdc7e08fdd8f1455c92ffb81\": container with ID starting with a3ae404c6576ae7345ea0e24a55339b72feb6064cdc7e08fdd8f1455c92ffb81 not found: ID does not exist"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.955571 4778 scope.go:117] "RemoveContainer" containerID="b535aecc0f121e30e87d6f0c91396c35344a7ad6b134782e4c381482346004e3"
Sep 30 17:22:17 crc kubenswrapper[4778]: E0930 17:22:17.955917 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b535aecc0f121e30e87d6f0c91396c35344a7ad6b134782e4c381482346004e3\": container with ID starting with b535aecc0f121e30e87d6f0c91396c35344a7ad6b134782e4c381482346004e3 not found: ID does not exist" containerID="b535aecc0f121e30e87d6f0c91396c35344a7ad6b134782e4c381482346004e3"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.956054 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b535aecc0f121e30e87d6f0c91396c35344a7ad6b134782e4c381482346004e3"} err="failed to get container status \"b535aecc0f121e30e87d6f0c91396c35344a7ad6b134782e4c381482346004e3\": rpc error: code = NotFound desc = could not find container \"b535aecc0f121e30e87d6f0c91396c35344a7ad6b134782e4c381482346004e3\": container with ID starting with b535aecc0f121e30e87d6f0c91396c35344a7ad6b134782e4c381482346004e3 not found: ID does not exist"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.956138 4778 scope.go:117] "RemoveContainer" containerID="fde1cfb85200da591799a72596e32e816c99c594f22786f7641254e071af1ebe"
Sep 30 17:22:17 crc kubenswrapper[4778]: E0930 17:22:17.956554 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fde1cfb85200da591799a72596e32e816c99c594f22786f7641254e071af1ebe\": container with ID starting with fde1cfb85200da591799a72596e32e816c99c594f22786f7641254e071af1ebe not found: ID does not exist" containerID="fde1cfb85200da591799a72596e32e816c99c594f22786f7641254e071af1ebe"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.956578 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fde1cfb85200da591799a72596e32e816c99c594f22786f7641254e071af1ebe"} err="failed to get container status \"fde1cfb85200da591799a72596e32e816c99c594f22786f7641254e071af1ebe\": rpc error: code = NotFound desc = could not find container \"fde1cfb85200da591799a72596e32e816c99c594f22786f7641254e071af1ebe\": container with ID starting with fde1cfb85200da591799a72596e32e816c99c594f22786f7641254e071af1ebe not found: ID does not exist"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.956595 4778 scope.go:117] "RemoveContainer" containerID="1a3aad032acc08231722b83fab0cc1a037db104a1d6f2cde1f7fbb262c488587"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.973139 4778 scope.go:117] "RemoveContainer" containerID="332fb01497fcfb524aa487251d48fb3da9cf28d1ebac5e41a0836377ddbfd371"
Sep 30 17:22:17 crc kubenswrapper[4778]: I0930 17:22:17.991302 4778 scope.go:117] "RemoveContainer" containerID="0fae513d24c93a52b45513974ac4d6e4b28f9f725c72440dc48b20efe27bc1f0"
Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.008822 4778 scope.go:117] "RemoveContainer" containerID="1a3aad032acc08231722b83fab0cc1a037db104a1d6f2cde1f7fbb262c488587"
Sep 30 17:22:18 crc kubenswrapper[4778]: E0930 17:22:18.009497 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a3aad032acc08231722b83fab0cc1a037db104a1d6f2cde1f7fbb262c488587\": container with ID starting with 1a3aad032acc08231722b83fab0cc1a037db104a1d6f2cde1f7fbb262c488587 not found: ID does not exist" containerID="1a3aad032acc08231722b83fab0cc1a037db104a1d6f2cde1f7fbb262c488587"
Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.009541 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a3aad032acc08231722b83fab0cc1a037db104a1d6f2cde1f7fbb262c488587"} err="failed to get container status \"1a3aad032acc08231722b83fab0cc1a037db104a1d6f2cde1f7fbb262c488587\": rpc error: code = NotFound desc = could not find container \"1a3aad032acc08231722b83fab0cc1a037db104a1d6f2cde1f7fbb262c488587\": container with ID starting with 1a3aad032acc08231722b83fab0cc1a037db104a1d6f2cde1f7fbb262c488587 not found: ID does not exist"
Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.009573 4778 scope.go:117] "RemoveContainer" containerID="332fb01497fcfb524aa487251d48fb3da9cf28d1ebac5e41a0836377ddbfd371"
Sep 30 17:22:18 crc kubenswrapper[4778]: E0930 17:22:18.010284 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"332fb01497fcfb524aa487251d48fb3da9cf28d1ebac5e41a0836377ddbfd371\": container with ID starting with 332fb01497fcfb524aa487251d48fb3da9cf28d1ebac5e41a0836377ddbfd371 not found: ID does not exist" containerID="332fb01497fcfb524aa487251d48fb3da9cf28d1ebac5e41a0836377ddbfd371"
Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.010330 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"332fb01497fcfb524aa487251d48fb3da9cf28d1ebac5e41a0836377ddbfd371"} err="failed to get container status \"332fb01497fcfb524aa487251d48fb3da9cf28d1ebac5e41a0836377ddbfd371\": rpc error: code = NotFound desc = could not find container \"332fb01497fcfb524aa487251d48fb3da9cf28d1ebac5e41a0836377ddbfd371\": container with ID starting with 332fb01497fcfb524aa487251d48fb3da9cf28d1ebac5e41a0836377ddbfd371 not found: ID does not exist"
Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.010362 4778 
scope.go:117] "RemoveContainer" containerID="0fae513d24c93a52b45513974ac4d6e4b28f9f725c72440dc48b20efe27bc1f0" Sep 30 17:22:18 crc kubenswrapper[4778]: E0930 17:22:18.011037 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fae513d24c93a52b45513974ac4d6e4b28f9f725c72440dc48b20efe27bc1f0\": container with ID starting with 0fae513d24c93a52b45513974ac4d6e4b28f9f725c72440dc48b20efe27bc1f0 not found: ID does not exist" containerID="0fae513d24c93a52b45513974ac4d6e4b28f9f725c72440dc48b20efe27bc1f0" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.011099 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fae513d24c93a52b45513974ac4d6e4b28f9f725c72440dc48b20efe27bc1f0"} err="failed to get container status \"0fae513d24c93a52b45513974ac4d6e4b28f9f725c72440dc48b20efe27bc1f0\": rpc error: code = NotFound desc = could not find container \"0fae513d24c93a52b45513974ac4d6e4b28f9f725c72440dc48b20efe27bc1f0\": container with ID starting with 0fae513d24c93a52b45513974ac4d6e4b28f9f725c72440dc48b20efe27bc1f0 not found: ID does not exist" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160216 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6ls2z"] Sep 30 17:22:18 crc kubenswrapper[4778]: E0930 17:22:18.160451 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e159da09-2a9f-4472-acca-abe0193feb9f" containerName="extract-content" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160465 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e159da09-2a9f-4472-acca-abe0193feb9f" containerName="extract-content" Sep 30 17:22:18 crc kubenswrapper[4778]: E0930 17:22:18.160479 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb93f0e3-e3b6-41bc-af18-91e3107fc79a" containerName="extract-content" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160488 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb93f0e3-e3b6-41bc-af18-91e3107fc79a" containerName="extract-content" Sep 30 17:22:18 crc kubenswrapper[4778]: E0930 17:22:18.160498 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122" containerName="registry-server" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160504 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122" containerName="registry-server" Sep 30 17:22:18 crc kubenswrapper[4778]: E0930 17:22:18.160512 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e159da09-2a9f-4472-acca-abe0193feb9f" containerName="extract-utilities" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160519 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e159da09-2a9f-4472-acca-abe0193feb9f" containerName="extract-utilities" Sep 30 17:22:18 crc kubenswrapper[4778]: E0930 17:22:18.160526 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b" containerName="registry-server" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160532 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b" containerName="registry-server" Sep 30 17:22:18 crc kubenswrapper[4778]: E0930 17:22:18.160540 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122" containerName="extract-content" Sep 30 17:22:18 crc 
kubenswrapper[4778]: I0930 17:22:18.160546 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122" containerName="extract-content" Sep 30 17:22:18 crc kubenswrapper[4778]: E0930 17:22:18.160557 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb93f0e3-e3b6-41bc-af18-91e3107fc79a" containerName="extract-utilities" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160562 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb93f0e3-e3b6-41bc-af18-91e3107fc79a" containerName="extract-utilities" Sep 30 17:22:18 crc kubenswrapper[4778]: E0930 17:22:18.160570 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b" containerName="extract-utilities" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160575 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b" containerName="extract-utilities" Sep 30 17:22:18 crc kubenswrapper[4778]: E0930 17:22:18.160585 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b" containerName="extract-content" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160590 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b" containerName="extract-content" Sep 30 17:22:18 crc kubenswrapper[4778]: E0930 17:22:18.160597 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e159da09-2a9f-4472-acca-abe0193feb9f" containerName="registry-server" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160603 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e159da09-2a9f-4472-acca-abe0193feb9f" containerName="registry-server" Sep 30 17:22:18 crc kubenswrapper[4778]: E0930 17:22:18.160628 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb93f0e3-e3b6-41bc-af18-91e3107fc79a" containerName="registry-server" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160633 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb93f0e3-e3b6-41bc-af18-91e3107fc79a" containerName="registry-server" Sep 30 17:22:18 crc kubenswrapper[4778]: E0930 17:22:18.160640 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="385ca6a0-940d-409a-a0aa-b22ab8920177" containerName="marketplace-operator" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160646 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="385ca6a0-940d-409a-a0aa-b22ab8920177" containerName="marketplace-operator" Sep 30 17:22:18 crc kubenswrapper[4778]: E0930 17:22:18.160655 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122" containerName="extract-utilities" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160662 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122" containerName="extract-utilities" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160751 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="385ca6a0-940d-409a-a0aa-b22ab8920177" containerName="marketplace-operator" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160760 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fabd2e3-f24a-411f-a8a6-ce455ddd6d9b" containerName="registry-server" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160767 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb93f0e3-e3b6-41bc-af18-91e3107fc79a" 
containerName="registry-server" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160775 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e159da09-2a9f-4472-acca-abe0193feb9f" containerName="registry-server" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.160783 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="dde7c6f6-ede6-4ae9-9fb2-b07f8ccb1122" containerName="registry-server" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.161463 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6ls2z" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.165629 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.177943 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6ls2z"] Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.334855 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b10c0547-5d08-46e5-bea1-53e2129b7f3a-utilities\") pod \"community-operators-6ls2z\" (UID: \"b10c0547-5d08-46e5-bea1-53e2129b7f3a\") " pod="openshift-marketplace/community-operators-6ls2z" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.334986 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b10c0547-5d08-46e5-bea1-53e2129b7f3a-catalog-content\") pod \"community-operators-6ls2z\" (UID: \"b10c0547-5d08-46e5-bea1-53e2129b7f3a\") " pod="openshift-marketplace/community-operators-6ls2z" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.335057 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z946\" (UniqueName: \"kubernetes.io/projected/b10c0547-5d08-46e5-bea1-53e2129b7f3a-kube-api-access-9z946\") pod \"community-operators-6ls2z\" (UID: \"b10c0547-5d08-46e5-bea1-53e2129b7f3a\") " pod="openshift-marketplace/community-operators-6ls2z" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.436148 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b10c0547-5d08-46e5-bea1-53e2129b7f3a-utilities\") pod \"community-operators-6ls2z\" (UID: \"b10c0547-5d08-46e5-bea1-53e2129b7f3a\") " pod="openshift-marketplace/community-operators-6ls2z" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.436764 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b10c0547-5d08-46e5-bea1-53e2129b7f3a-catalog-content\") pod \"community-operators-6ls2z\" (UID: \"b10c0547-5d08-46e5-bea1-53e2129b7f3a\") " pod="openshift-marketplace/community-operators-6ls2z" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.436798 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z946\" (UniqueName: \"kubernetes.io/projected/b10c0547-5d08-46e5-bea1-53e2129b7f3a-kube-api-access-9z946\") pod \"community-operators-6ls2z\" (UID: \"b10c0547-5d08-46e5-bea1-53e2129b7f3a\") " pod="openshift-marketplace/community-operators-6ls2z" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.436851 4778 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b10c0547-5d08-46e5-bea1-53e2129b7f3a-utilities\") pod \"community-operators-6ls2z\" (UID: \"b10c0547-5d08-46e5-bea1-53e2129b7f3a\") " pod="openshift-marketplace/community-operators-6ls2z" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.437505 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b10c0547-5d08-46e5-bea1-53e2129b7f3a-catalog-content\") pod \"community-operators-6ls2z\" (UID: \"b10c0547-5d08-46e5-bea1-53e2129b7f3a\") " pod="openshift-marketplace/community-operators-6ls2z" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.467987 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z946\" (UniqueName: \"kubernetes.io/projected/b10c0547-5d08-46e5-bea1-53e2129b7f3a-kube-api-access-9z946\") pod \"community-operators-6ls2z\" (UID: \"b10c0547-5d08-46e5-bea1-53e2129b7f3a\") " pod="openshift-marketplace/community-operators-6ls2z" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.478109 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6ls2z" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.616023 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9nzgv" event={"ID":"d46eace9-9047-4db1-acb9-1a588ab49434","Type":"ContainerStarted","Data":"544682184fe71f06a88ed221de7acbef78108d0b3568a673731238fcb5dc812a"} Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.616064 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-9nzgv" event={"ID":"d46eace9-9047-4db1-acb9-1a588ab49434","Type":"ContainerStarted","Data":"3917f9191c969d95401e9fcb5025243c61c774c0d0109f5027b1ada3a2143ea6"} Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.616395 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-9nzgv" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.626972 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-9nzgv" Sep 30 17:22:18 crc kubenswrapper[4778]: I0930 17:22:18.638509 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-9nzgv" podStartSLOduration=2.638485147 podStartE2EDuration="2.638485147s" podCreationTimestamp="2025-09-30 17:22:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:22:18.634986561 +0000 UTC m=+277.624884374" watchObservedRunningTime="2025-09-30 17:22:18.638485147 +0000 UTC m=+277.628382950" Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.134666 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6ls2z"] Sep 30 17:22:19 crc kubenswrapper[4778]: W0930 17:22:19.147960 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb10c0547_5d08_46e5_bea1_53e2129b7f3a.slice/crio-082e021ac4c8cbda7fcf57ae592b4321882d1119c06197daae089feaf3032de8 WatchSource:0}: Error finding container 082e021ac4c8cbda7fcf57ae592b4321882d1119c06197daae089feaf3032de8: Status 404 returned error can't find the container with id 
082e021ac4c8cbda7fcf57ae592b4321882d1119c06197daae089feaf3032de8 Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.566181 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9gtp4"] Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.568753 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gtp4" Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.571661 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gtp4"] Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.572106 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.640479 4778 generic.go:334] "Generic (PLEG): container finished" podID="b10c0547-5d08-46e5-bea1-53e2129b7f3a" containerID="1af1828a0f64167b15e2c38212f9a993df3b4b690bddd044966c718e264a12f0" exitCode=0 Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.640528 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ls2z" event={"ID":"b10c0547-5d08-46e5-bea1-53e2129b7f3a","Type":"ContainerDied","Data":"1af1828a0f64167b15e2c38212f9a993df3b4b690bddd044966c718e264a12f0"} Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.640572 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ls2z" event={"ID":"b10c0547-5d08-46e5-bea1-53e2129b7f3a","Type":"ContainerStarted","Data":"082e021ac4c8cbda7fcf57ae592b4321882d1119c06197daae089feaf3032de8"} Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.654286 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xz5l\" (UniqueName: \"kubernetes.io/projected/23443d75-478c-4f20-b3fd-3eceb05a37d9-kube-api-access-5xz5l\") pod \"redhat-marketplace-9gtp4\" (UID: \"23443d75-478c-4f20-b3fd-3eceb05a37d9\") " pod="openshift-marketplace/redhat-marketplace-9gtp4" Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.654350 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23443d75-478c-4f20-b3fd-3eceb05a37d9-catalog-content\") pod \"redhat-marketplace-9gtp4\" (UID: \"23443d75-478c-4f20-b3fd-3eceb05a37d9\") " pod="openshift-marketplace/redhat-marketplace-9gtp4" Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.654507 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23443d75-478c-4f20-b3fd-3eceb05a37d9-utilities\") pod \"redhat-marketplace-9gtp4\" (UID: \"23443d75-478c-4f20-b3fd-3eceb05a37d9\") " pod="openshift-marketplace/redhat-marketplace-9gtp4" Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.721167 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb93f0e3-e3b6-41bc-af18-91e3107fc79a" path="/var/lib/kubelet/pods/bb93f0e3-e3b6-41bc-af18-91e3107fc79a/volumes" Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.755599 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xz5l\" (UniqueName: \"kubernetes.io/projected/23443d75-478c-4f20-b3fd-3eceb05a37d9-kube-api-access-5xz5l\") pod \"redhat-marketplace-9gtp4\" (UID: \"23443d75-478c-4f20-b3fd-3eceb05a37d9\") " 
pod="openshift-marketplace/redhat-marketplace-9gtp4" Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.755690 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23443d75-478c-4f20-b3fd-3eceb05a37d9-catalog-content\") pod \"redhat-marketplace-9gtp4\" (UID: \"23443d75-478c-4f20-b3fd-3eceb05a37d9\") " pod="openshift-marketplace/redhat-marketplace-9gtp4" Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.755740 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23443d75-478c-4f20-b3fd-3eceb05a37d9-utilities\") pod \"redhat-marketplace-9gtp4\" (UID: \"23443d75-478c-4f20-b3fd-3eceb05a37d9\") " pod="openshift-marketplace/redhat-marketplace-9gtp4" Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.756539 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23443d75-478c-4f20-b3fd-3eceb05a37d9-utilities\") pod \"redhat-marketplace-9gtp4\" (UID: \"23443d75-478c-4f20-b3fd-3eceb05a37d9\") " pod="openshift-marketplace/redhat-marketplace-9gtp4" Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.756890 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23443d75-478c-4f20-b3fd-3eceb05a37d9-catalog-content\") pod \"redhat-marketplace-9gtp4\" (UID: \"23443d75-478c-4f20-b3fd-3eceb05a37d9\") " pod="openshift-marketplace/redhat-marketplace-9gtp4" Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.780022 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xz5l\" (UniqueName: \"kubernetes.io/projected/23443d75-478c-4f20-b3fd-3eceb05a37d9-kube-api-access-5xz5l\") pod \"redhat-marketplace-9gtp4\" (UID: \"23443d75-478c-4f20-b3fd-3eceb05a37d9\") " pod="openshift-marketplace/redhat-marketplace-9gtp4" Sep 30 17:22:19 crc kubenswrapper[4778]: I0930 17:22:19.889453 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gtp4" Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.297608 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gtp4"] Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.563338 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xsmps"] Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.564274 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xsmps" Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.568863 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.591698 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xsmps"] Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.648310 4778 generic.go:334] "Generic (PLEG): container finished" podID="23443d75-478c-4f20-b3fd-3eceb05a37d9" containerID="80fe509e54d1fd31a415e232083805e66dd61d198b7497465c15bc31c21e0a62" exitCode=0 Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.648404 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gtp4" event={"ID":"23443d75-478c-4f20-b3fd-3eceb05a37d9","Type":"ContainerDied","Data":"80fe509e54d1fd31a415e232083805e66dd61d198b7497465c15bc31c21e0a62"} Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.648531 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gtp4" event={"ID":"23443d75-478c-4f20-b3fd-3eceb05a37d9","Type":"ContainerStarted","Data":"958c4cee4f71dda346b005c2d02c66e635e3cce33b16ae044f48f367df1ce828"} Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.655245 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ls2z" event={"ID":"b10c0547-5d08-46e5-bea1-53e2129b7f3a","Type":"ContainerStarted","Data":"e45f19056b264d3d559b63f18fb9ff715416be876e9f0e8c439afac3a30a69b7"} Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.669011 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8227d8db-e2f9-44dc-a41a-efb4088be2fa-catalog-content\") pod \"redhat-operators-xsmps\" (UID: \"8227d8db-e2f9-44dc-a41a-efb4088be2fa\") " pod="openshift-marketplace/redhat-operators-xsmps" Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.669044 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpn9r\" (UniqueName: \"kubernetes.io/projected/8227d8db-e2f9-44dc-a41a-efb4088be2fa-kube-api-access-jpn9r\") pod \"redhat-operators-xsmps\" (UID: \"8227d8db-e2f9-44dc-a41a-efb4088be2fa\") " pod="openshift-marketplace/redhat-operators-xsmps" Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.669082 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8227d8db-e2f9-44dc-a41a-efb4088be2fa-utilities\") pod \"redhat-operators-xsmps\" (UID: \"8227d8db-e2f9-44dc-a41a-efb4088be2fa\") " pod="openshift-marketplace/redhat-operators-xsmps" Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.770453 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8227d8db-e2f9-44dc-a41a-efb4088be2fa-catalog-content\") pod \"redhat-operators-xsmps\" (UID: \"8227d8db-e2f9-44dc-a41a-efb4088be2fa\") " pod="openshift-marketplace/redhat-operators-xsmps" Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.770530 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpn9r\" (UniqueName: \"kubernetes.io/projected/8227d8db-e2f9-44dc-a41a-efb4088be2fa-kube-api-access-jpn9r\") pod 
\"redhat-operators-xsmps\" (UID: \"8227d8db-e2f9-44dc-a41a-efb4088be2fa\") " pod="openshift-marketplace/redhat-operators-xsmps" Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.770646 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8227d8db-e2f9-44dc-a41a-efb4088be2fa-utilities\") pod \"redhat-operators-xsmps\" (UID: \"8227d8db-e2f9-44dc-a41a-efb4088be2fa\") " pod="openshift-marketplace/redhat-operators-xsmps" Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.771870 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8227d8db-e2f9-44dc-a41a-efb4088be2fa-catalog-content\") pod \"redhat-operators-xsmps\" (UID: \"8227d8db-e2f9-44dc-a41a-efb4088be2fa\") " pod="openshift-marketplace/redhat-operators-xsmps" Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.771957 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8227d8db-e2f9-44dc-a41a-efb4088be2fa-utilities\") pod \"redhat-operators-xsmps\" (UID: \"8227d8db-e2f9-44dc-a41a-efb4088be2fa\") " pod="openshift-marketplace/redhat-operators-xsmps" Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.788607 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpn9r\" (UniqueName: \"kubernetes.io/projected/8227d8db-e2f9-44dc-a41a-efb4088be2fa-kube-api-access-jpn9r\") pod \"redhat-operators-xsmps\" (UID: \"8227d8db-e2f9-44dc-a41a-efb4088be2fa\") " pod="openshift-marketplace/redhat-operators-xsmps" Sep 30 17:22:20 crc kubenswrapper[4778]: I0930 17:22:20.908375 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xsmps" Sep 30 17:22:21 crc kubenswrapper[4778]: I0930 17:22:21.103721 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xsmps"] Sep 30 17:22:21 crc kubenswrapper[4778]: I0930 17:22:21.663337 4778 generic.go:334] "Generic (PLEG): container finished" podID="b10c0547-5d08-46e5-bea1-53e2129b7f3a" containerID="e45f19056b264d3d559b63f18fb9ff715416be876e9f0e8c439afac3a30a69b7" exitCode=0 Sep 30 17:22:21 crc kubenswrapper[4778]: I0930 17:22:21.663459 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ls2z" event={"ID":"b10c0547-5d08-46e5-bea1-53e2129b7f3a","Type":"ContainerDied","Data":"e45f19056b264d3d559b63f18fb9ff715416be876e9f0e8c439afac3a30a69b7"} Sep 30 17:22:21 crc kubenswrapper[4778]: I0930 17:22:21.666431 4778 generic.go:334] "Generic (PLEG): container finished" podID="8227d8db-e2f9-44dc-a41a-efb4088be2fa" containerID="cc439b05e6edeacb7de689c4493b90f4b1c213af89c1c9ac8bfd3ec920b1019b" exitCode=0 Sep 30 17:22:21 crc kubenswrapper[4778]: I0930 17:22:21.666526 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xsmps" event={"ID":"8227d8db-e2f9-44dc-a41a-efb4088be2fa","Type":"ContainerDied","Data":"cc439b05e6edeacb7de689c4493b90f4b1c213af89c1c9ac8bfd3ec920b1019b"} Sep 30 17:22:21 crc kubenswrapper[4778]: I0930 17:22:21.666580 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xsmps" event={"ID":"8227d8db-e2f9-44dc-a41a-efb4088be2fa","Type":"ContainerStarted","Data":"4a8262787054e6e9956db9a5714c5a8e033aa752ab890ac811e367d422e10090"} Sep 30 17:22:21 crc kubenswrapper[4778]: I0930 17:22:21.969098 
4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lx7lx"] Sep 30 17:22:21 crc kubenswrapper[4778]: I0930 17:22:21.970788 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lx7lx" Sep 30 17:22:21 crc kubenswrapper[4778]: I0930 17:22:21.973117 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lx7lx"] Sep 30 17:22:21 crc kubenswrapper[4778]: I0930 17:22:21.974199 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 17:22:22 crc kubenswrapper[4778]: I0930 17:22:22.090395 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f2dbd49-d11a-4af8-8241-89981ad46467-utilities\") pod \"certified-operators-lx7lx\" (UID: \"2f2dbd49-d11a-4af8-8241-89981ad46467\") " pod="openshift-marketplace/certified-operators-lx7lx" Sep 30 17:22:22 crc kubenswrapper[4778]: I0930 17:22:22.090744 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgbk4\" (UniqueName: \"kubernetes.io/projected/2f2dbd49-d11a-4af8-8241-89981ad46467-kube-api-access-vgbk4\") pod \"certified-operators-lx7lx\" (UID: \"2f2dbd49-d11a-4af8-8241-89981ad46467\") " pod="openshift-marketplace/certified-operators-lx7lx" Sep 30 17:22:22 crc kubenswrapper[4778]: I0930 17:22:22.090899 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f2dbd49-d11a-4af8-8241-89981ad46467-catalog-content\") pod \"certified-operators-lx7lx\" (UID: \"2f2dbd49-d11a-4af8-8241-89981ad46467\") " pod="openshift-marketplace/certified-operators-lx7lx" Sep 30 17:22:22 crc kubenswrapper[4778]: I0930 17:22:22.192484 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgbk4\" (UniqueName: \"kubernetes.io/projected/2f2dbd49-d11a-4af8-8241-89981ad46467-kube-api-access-vgbk4\") pod \"certified-operators-lx7lx\" (UID: \"2f2dbd49-d11a-4af8-8241-89981ad46467\") " pod="openshift-marketplace/certified-operators-lx7lx" Sep 30 17:22:22 crc kubenswrapper[4778]: I0930 17:22:22.192960 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f2dbd49-d11a-4af8-8241-89981ad46467-catalog-content\") pod \"certified-operators-lx7lx\" (UID: \"2f2dbd49-d11a-4af8-8241-89981ad46467\") " pod="openshift-marketplace/certified-operators-lx7lx" Sep 30 17:22:22 crc kubenswrapper[4778]: I0930 17:22:22.193000 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f2dbd49-d11a-4af8-8241-89981ad46467-utilities\") pod \"certified-operators-lx7lx\" (UID: \"2f2dbd49-d11a-4af8-8241-89981ad46467\") " pod="openshift-marketplace/certified-operators-lx7lx" Sep 30 17:22:22 crc kubenswrapper[4778]: I0930 17:22:22.193474 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f2dbd49-d11a-4af8-8241-89981ad46467-catalog-content\") pod \"certified-operators-lx7lx\" (UID: \"2f2dbd49-d11a-4af8-8241-89981ad46467\") " pod="openshift-marketplace/certified-operators-lx7lx" Sep 30 17:22:22 crc kubenswrapper[4778]: I0930 
17:22:22.194863 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f2dbd49-d11a-4af8-8241-89981ad46467-utilities\") pod \"certified-operators-lx7lx\" (UID: \"2f2dbd49-d11a-4af8-8241-89981ad46467\") " pod="openshift-marketplace/certified-operators-lx7lx" Sep 30 17:22:22 crc kubenswrapper[4778]: I0930 17:22:22.212406 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgbk4\" (UniqueName: \"kubernetes.io/projected/2f2dbd49-d11a-4af8-8241-89981ad46467-kube-api-access-vgbk4\") pod \"certified-operators-lx7lx\" (UID: \"2f2dbd49-d11a-4af8-8241-89981ad46467\") " pod="openshift-marketplace/certified-operators-lx7lx" Sep 30 17:22:22 crc kubenswrapper[4778]: I0930 17:22:22.298391 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lx7lx" Sep 30 17:22:22 crc kubenswrapper[4778]: I0930 17:22:22.674976 4778 generic.go:334] "Generic (PLEG): container finished" podID="23443d75-478c-4f20-b3fd-3eceb05a37d9" containerID="8725de955b2ad2e89f8c72e4cb3015eb23a4ec0d1aab6c4e515297754ff93599" exitCode=0 Sep 30 17:22:22 crc kubenswrapper[4778]: I0930 17:22:22.675100 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gtp4" event={"ID":"23443d75-478c-4f20-b3fd-3eceb05a37d9","Type":"ContainerDied","Data":"8725de955b2ad2e89f8c72e4cb3015eb23a4ec0d1aab6c4e515297754ff93599"} Sep 30 17:22:22 crc kubenswrapper[4778]: I0930 17:22:22.702844 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lx7lx"] Sep 30 17:22:22 crc kubenswrapper[4778]: W0930 17:22:22.704815 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f2dbd49_d11a_4af8_8241_89981ad46467.slice/crio-2ecff6e22fa9b578b8cde1be07377a20d172edd86d6c3c126087665430aba375 WatchSource:0}: Error finding container 2ecff6e22fa9b578b8cde1be07377a20d172edd86d6c3c126087665430aba375: Status 404 returned error can't find the container with id 2ecff6e22fa9b578b8cde1be07377a20d172edd86d6c3c126087665430aba375 Sep 30 17:22:23 crc kubenswrapper[4778]: I0930 17:22:23.690090 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ls2z" event={"ID":"b10c0547-5d08-46e5-bea1-53e2129b7f3a","Type":"ContainerStarted","Data":"282dd86ad4160acaa60ab84eb66f242b6dcd8aae3d5fbef872b0c04e59515303"} Sep 30 17:22:23 crc kubenswrapper[4778]: I0930 17:22:23.693353 4778 generic.go:334] "Generic (PLEG): container finished" podID="2f2dbd49-d11a-4af8-8241-89981ad46467" containerID="6fe3287b74429378a7b353a0494f9ec4b70289eab3f66d65535ee04a3ae71066" exitCode=0 Sep 30 17:22:23 crc kubenswrapper[4778]: I0930 17:22:23.693510 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lx7lx" event={"ID":"2f2dbd49-d11a-4af8-8241-89981ad46467","Type":"ContainerDied","Data":"6fe3287b74429378a7b353a0494f9ec4b70289eab3f66d65535ee04a3ae71066"} Sep 30 17:22:23 crc kubenswrapper[4778]: I0930 17:22:23.693599 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lx7lx" event={"ID":"2f2dbd49-d11a-4af8-8241-89981ad46467","Type":"ContainerStarted","Data":"2ecff6e22fa9b578b8cde1be07377a20d172edd86d6c3c126087665430aba375"} Sep 30 17:22:23 crc kubenswrapper[4778]: I0930 17:22:23.713650 4778 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/community-operators-6ls2z" podStartSLOduration=2.414297614 podStartE2EDuration="5.713604032s" podCreationTimestamp="2025-09-30 17:22:18 +0000 UTC" firstStartedPulling="2025-09-30 17:22:19.643187597 +0000 UTC m=+278.633085400" lastFinishedPulling="2025-09-30 17:22:22.942494025 +0000 UTC m=+281.932391818" observedRunningTime="2025-09-30 17:22:23.711214716 +0000 UTC m=+282.701112519" watchObservedRunningTime="2025-09-30 17:22:23.713604032 +0000 UTC m=+282.703501825" Sep 30 17:22:24 crc kubenswrapper[4778]: I0930 17:22:24.711965 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gtp4" event={"ID":"23443d75-478c-4f20-b3fd-3eceb05a37d9","Type":"ContainerStarted","Data":"b91e0a9dd7424defd7f46427550dcdb1ff7beeaec1526fe5ef18f3b7aadc57c4"} Sep 30 17:22:24 crc kubenswrapper[4778]: I0930 17:22:24.715102 4778 generic.go:334] "Generic (PLEG): container finished" podID="8227d8db-e2f9-44dc-a41a-efb4088be2fa" containerID="25bd7b42f8cb1a0268938fdb4d4b50b42e309750784da1920612ecf90ebee48b" exitCode=0 Sep 30 17:22:24 crc kubenswrapper[4778]: I0930 17:22:24.716447 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xsmps" event={"ID":"8227d8db-e2f9-44dc-a41a-efb4088be2fa","Type":"ContainerDied","Data":"25bd7b42f8cb1a0268938fdb4d4b50b42e309750784da1920612ecf90ebee48b"} Sep 30 17:22:24 crc kubenswrapper[4778]: I0930 17:22:24.754488 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9gtp4" podStartSLOduration=2.384129031 podStartE2EDuration="5.75445415s" podCreationTimestamp="2025-09-30 17:22:19 +0000 UTC" firstStartedPulling="2025-09-30 17:22:20.649567704 +0000 UTC m=+279.639465507" lastFinishedPulling="2025-09-30 17:22:24.019892823 +0000 UTC m=+283.009790626" observedRunningTime="2025-09-30 17:22:24.733786385 +0000 UTC m=+283.723684198" watchObservedRunningTime="2025-09-30 17:22:24.75445415 +0000 UTC m=+283.744351953" Sep 30 17:22:25 crc kubenswrapper[4778]: I0930 17:22:25.722514 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xsmps" event={"ID":"8227d8db-e2f9-44dc-a41a-efb4088be2fa","Type":"ContainerStarted","Data":"7a4a5aeaf7faf437e2c52261e95b0dc3ac76a710ef73cbd511a7844caa7df1d6"} Sep 30 17:22:25 crc kubenswrapper[4778]: I0930 17:22:25.724515 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lx7lx" event={"ID":"2f2dbd49-d11a-4af8-8241-89981ad46467","Type":"ContainerStarted","Data":"3fbb29f5825ed4a0224639737ef84729af2536a21f42569fcd217843e352c32e"} Sep 30 17:22:25 crc kubenswrapper[4778]: I0930 17:22:25.742464 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xsmps" podStartSLOduration=2.185086963 podStartE2EDuration="5.742447904s" podCreationTimestamp="2025-09-30 17:22:20 +0000 UTC" firstStartedPulling="2025-09-30 17:22:21.687058511 +0000 UTC m=+280.676956314" lastFinishedPulling="2025-09-30 17:22:25.244419452 +0000 UTC m=+284.234317255" observedRunningTime="2025-09-30 17:22:25.741371115 +0000 UTC m=+284.731268918" watchObservedRunningTime="2025-09-30 17:22:25.742447904 +0000 UTC m=+284.732345707" Sep 30 17:22:26 crc kubenswrapper[4778]: I0930 17:22:26.731880 4778 generic.go:334] "Generic (PLEG): container finished" podID="2f2dbd49-d11a-4af8-8241-89981ad46467" 
containerID="3fbb29f5825ed4a0224639737ef84729af2536a21f42569fcd217843e352c32e" exitCode=0 Sep 30 17:22:26 crc kubenswrapper[4778]: I0930 17:22:26.731967 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lx7lx" event={"ID":"2f2dbd49-d11a-4af8-8241-89981ad46467","Type":"ContainerDied","Data":"3fbb29f5825ed4a0224639737ef84729af2536a21f42569fcd217843e352c32e"} Sep 30 17:22:28 crc kubenswrapper[4778]: I0930 17:22:28.478504 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6ls2z" Sep 30 17:22:28 crc kubenswrapper[4778]: I0930 17:22:28.478817 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6ls2z" Sep 30 17:22:28 crc kubenswrapper[4778]: I0930 17:22:28.517565 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6ls2z" Sep 30 17:22:28 crc kubenswrapper[4778]: I0930 17:22:28.746658 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lx7lx" event={"ID":"2f2dbd49-d11a-4af8-8241-89981ad46467","Type":"ContainerStarted","Data":"e40d88108dc415bff27dc34a2cad7ef51a951c04a23bac076717cc5e050166d7"} Sep 30 17:22:28 crc kubenswrapper[4778]: I0930 17:22:28.769544 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lx7lx" podStartSLOduration=3.730039852 podStartE2EDuration="7.769509031s" podCreationTimestamp="2025-09-30 17:22:21 +0000 UTC" firstStartedPulling="2025-09-30 17:22:23.69895092 +0000 UTC m=+282.688848713" lastFinishedPulling="2025-09-30 17:22:27.738420089 +0000 UTC m=+286.728317892" observedRunningTime="2025-09-30 17:22:28.764277828 +0000 UTC m=+287.754175661" watchObservedRunningTime="2025-09-30 17:22:28.769509031 +0000 UTC m=+287.759406834" Sep 30 17:22:28 crc kubenswrapper[4778]: I0930 17:22:28.786478 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6ls2z" Sep 30 17:22:29 crc kubenswrapper[4778]: I0930 17:22:29.890254 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9gtp4" Sep 30 17:22:29 crc kubenswrapper[4778]: I0930 17:22:29.890358 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9gtp4" Sep 30 17:22:29 crc kubenswrapper[4778]: I0930 17:22:29.934943 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9gtp4" Sep 30 17:22:30 crc kubenswrapper[4778]: I0930 17:22:30.797943 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9gtp4" Sep 30 17:22:30 crc kubenswrapper[4778]: I0930 17:22:30.908684 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xsmps" Sep 30 17:22:30 crc kubenswrapper[4778]: I0930 17:22:30.908740 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xsmps" Sep 30 17:22:30 crc kubenswrapper[4778]: I0930 17:22:30.950908 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xsmps" Sep 30 17:22:31 crc kubenswrapper[4778]: I0930 17:22:31.821930 4778 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xsmps" Sep 30 17:22:32 crc kubenswrapper[4778]: I0930 17:22:32.299359 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lx7lx" Sep 30 17:22:32 crc kubenswrapper[4778]: I0930 17:22:32.299700 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lx7lx" Sep 30 17:22:32 crc kubenswrapper[4778]: I0930 17:22:32.368784 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lx7lx" Sep 30 17:22:32 crc kubenswrapper[4778]: I0930 17:22:32.808272 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lx7lx" Sep 30 17:23:44 crc kubenswrapper[4778]: I0930 17:23:44.812101 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:23:44 crc kubenswrapper[4778]: I0930 17:23:44.813172 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:24:14 crc kubenswrapper[4778]: I0930 17:24:14.812534 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:24:14 crc kubenswrapper[4778]: I0930 17:24:14.814368 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:24:44 crc kubenswrapper[4778]: I0930 17:24:44.812161 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:24:44 crc kubenswrapper[4778]: I0930 17:24:44.812929 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:24:44 crc kubenswrapper[4778]: I0930 17:24:44.812986 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:24:44 crc kubenswrapper[4778]: I0930 17:24:44.813634 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"13a00401a21c43e81f99eb39c3b8e765d2e6610548614249a56f21e67091808b"} pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:24:44 crc kubenswrapper[4778]: I0930 17:24:44.813697 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" containerID="cri-o://13a00401a21c43e81f99eb39c3b8e765d2e6610548614249a56f21e67091808b" gracePeriod=600 Sep 30 17:24:45 crc kubenswrapper[4778]: I0930 17:24:45.541179 4778 generic.go:334] "Generic (PLEG): container finished" podID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerID="13a00401a21c43e81f99eb39c3b8e765d2e6610548614249a56f21e67091808b" exitCode=0 Sep 30 17:24:45 crc kubenswrapper[4778]: I0930 17:24:45.541271 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerDied","Data":"13a00401a21c43e81f99eb39c3b8e765d2e6610548614249a56f21e67091808b"} Sep 30 17:24:45 crc kubenswrapper[4778]: I0930 17:24:45.542158 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerStarted","Data":"87ec9413bfb27167167aeb00914e82262471cb92624b3bf11492e1b4663098b9"} Sep 30 17:24:45 crc kubenswrapper[4778]: I0930 17:24:45.542233 4778 scope.go:117] "RemoveContainer" containerID="abaa7ae8b9dadee26b8e5d0171fd989733ff34adb2ff8ffd8fce19af2e842505" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.115375 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hfnth"] Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.116865 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.136593 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hfnth"] Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.213515 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d6bfece6-9b64-4491-a686-38fcd405ff84-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.213586 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6bfece6-9b64-4491-a686-38fcd405ff84-trusted-ca\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.213681 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.213713 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr78n\" (UniqueName: \"kubernetes.io/projected/d6bfece6-9b64-4491-a686-38fcd405ff84-kube-api-access-zr78n\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.213777 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d6bfece6-9b64-4491-a686-38fcd405ff84-registry-certificates\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.213800 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d6bfece6-9b64-4491-a686-38fcd405ff84-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.213825 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6bfece6-9b64-4491-a686-38fcd405ff84-bound-sa-token\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.213849 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/d6bfece6-9b64-4491-a686-38fcd405ff84-registry-tls\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.240816 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.314587 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d6bfece6-9b64-4491-a686-38fcd405ff84-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.314930 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6bfece6-9b64-4491-a686-38fcd405ff84-trusted-ca\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.315060 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr78n\" (UniqueName: \"kubernetes.io/projected/d6bfece6-9b64-4491-a686-38fcd405ff84-kube-api-access-zr78n\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.315122 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d6bfece6-9b64-4491-a686-38fcd405ff84-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.315229 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d6bfece6-9b64-4491-a686-38fcd405ff84-registry-certificates\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.315402 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d6bfece6-9b64-4491-a686-38fcd405ff84-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.315510 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6bfece6-9b64-4491-a686-38fcd405ff84-bound-sa-token\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.315642 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6bfece6-9b64-4491-a686-38fcd405ff84-registry-tls\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.316522 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d6bfece6-9b64-4491-a686-38fcd405ff84-registry-certificates\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.316566 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d6bfece6-9b64-4491-a686-38fcd405ff84-trusted-ca\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.322691 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d6bfece6-9b64-4491-a686-38fcd405ff84-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.323138 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d6bfece6-9b64-4491-a686-38fcd405ff84-registry-tls\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.332245 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr78n\" (UniqueName: \"kubernetes.io/projected/d6bfece6-9b64-4491-a686-38fcd405ff84-kube-api-access-zr78n\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.333714 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d6bfece6-9b64-4491-a686-38fcd405ff84-bound-sa-token\") pod \"image-registry-66df7c8f76-hfnth\" (UID: \"d6bfece6-9b64-4491-a686-38fcd405ff84\") " pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.476342 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.899808 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hfnth"] Sep 30 17:26:04 crc kubenswrapper[4778]: I0930 17:26:04.991128 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" event={"ID":"d6bfece6-9b64-4491-a686-38fcd405ff84","Type":"ContainerStarted","Data":"73eb71da033255d19b454ae76689ea3ee00164d5f2383d8450382220a5bb3078"} Sep 30 17:26:05 crc kubenswrapper[4778]: I0930 17:26:05.997354 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" event={"ID":"d6bfece6-9b64-4491-a686-38fcd405ff84","Type":"ContainerStarted","Data":"1820d57003bd96646dbc3c95ebafdb74bf54f00a7297a46de869a67d86f63517"} Sep 30 17:26:05 crc kubenswrapper[4778]: I0930 17:26:05.997551 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:06 crc kubenswrapper[4778]: I0930 17:26:06.018072 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" podStartSLOduration=2.018039356 podStartE2EDuration="2.018039356s" podCreationTimestamp="2025-09-30 17:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:26:06.015524675 +0000 UTC m=+505.005422478" watchObservedRunningTime="2025-09-30 17:26:06.018039356 +0000 UTC m=+505.007937169" Sep 30 17:26:24 crc kubenswrapper[4778]: I0930 17:26:24.481492 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-hfnth" Sep 30 17:26:24 crc kubenswrapper[4778]: I0930 17:26:24.550677 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dtqsp"] Sep 30 17:26:49 crc kubenswrapper[4778]: I0930 17:26:49.590640 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" podUID="452a4879-b0bd-490e-bffb-25f8404a6eac" containerName="registry" containerID="cri-o://999172d57c203d9f1102512f66f8a88c1d21a4cb0ed4beb85999edcc16149741" gracePeriod=30 Sep 30 17:26:49 crc kubenswrapper[4778]: I0930 17:26:49.919995 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:26:49 crc kubenswrapper[4778]: I0930 17:26:49.976940 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-bound-sa-token\") pod \"452a4879-b0bd-490e-bffb-25f8404a6eac\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " Sep 30 17:26:49 crc kubenswrapper[4778]: I0930 17:26:49.976991 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/452a4879-b0bd-490e-bffb-25f8404a6eac-trusted-ca\") pod \"452a4879-b0bd-490e-bffb-25f8404a6eac\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " Sep 30 17:26:49 crc kubenswrapper[4778]: I0930 17:26:49.977049 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/452a4879-b0bd-490e-bffb-25f8404a6eac-registry-certificates\") pod \"452a4879-b0bd-490e-bffb-25f8404a6eac\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " Sep 30 17:26:49 crc kubenswrapper[4778]: I0930 17:26:49.977068 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/452a4879-b0bd-490e-bffb-25f8404a6eac-ca-trust-extracted\") pod \"452a4879-b0bd-490e-bffb-25f8404a6eac\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " Sep 30 17:26:49 crc kubenswrapper[4778]: I0930 17:26:49.977100 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-registry-tls\") pod \"452a4879-b0bd-490e-bffb-25f8404a6eac\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " Sep 30 17:26:49 crc kubenswrapper[4778]: I0930 17:26:49.977354 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"452a4879-b0bd-490e-bffb-25f8404a6eac\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " Sep 30 17:26:49 crc kubenswrapper[4778]: I0930 17:26:49.977384 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fggq2\" (UniqueName: \"kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-kube-api-access-fggq2\") pod \"452a4879-b0bd-490e-bffb-25f8404a6eac\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " Sep 30 17:26:49 crc kubenswrapper[4778]: I0930 17:26:49.977416 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/452a4879-b0bd-490e-bffb-25f8404a6eac-installation-pull-secrets\") pod \"452a4879-b0bd-490e-bffb-25f8404a6eac\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " Sep 30 17:26:49 crc kubenswrapper[4778]: I0930 17:26:49.979378 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/452a4879-b0bd-490e-bffb-25f8404a6eac-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "452a4879-b0bd-490e-bffb-25f8404a6eac" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:26:49 crc kubenswrapper[4778]: I0930 17:26:49.979562 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/452a4879-b0bd-490e-bffb-25f8404a6eac-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "452a4879-b0bd-490e-bffb-25f8404a6eac" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:26:49 crc kubenswrapper[4778]: I0930 17:26:49.985167 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "452a4879-b0bd-490e-bffb-25f8404a6eac" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:26:49 crc kubenswrapper[4778]: I0930 17:26:49.985645 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "452a4879-b0bd-490e-bffb-25f8404a6eac" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:26:49 crc kubenswrapper[4778]: I0930 17:26:49.986030 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/452a4879-b0bd-490e-bffb-25f8404a6eac-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "452a4879-b0bd-490e-bffb-25f8404a6eac" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:26:49 crc kubenswrapper[4778]: I0930 17:26:49.986449 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-kube-api-access-fggq2" (OuterVolumeSpecName: "kube-api-access-fggq2") pod "452a4879-b0bd-490e-bffb-25f8404a6eac" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac"). InnerVolumeSpecName "kube-api-access-fggq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:26:49 crc kubenswrapper[4778]: E0930 17:26:49.987130 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:452a4879-b0bd-490e-bffb-25f8404a6eac nodeName:}" failed. No retries permitted until 2025-09-30 17:26:50.487110429 +0000 UTC m=+549.477008232 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "registry-storage" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "452a4879-b0bd-490e-bffb-25f8404a6eac" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Sep 30 17:26:49 crc kubenswrapper[4778]: I0930 17:26:49.995594 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/452a4879-b0bd-490e-bffb-25f8404a6eac-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "452a4879-b0bd-490e-bffb-25f8404a6eac" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac"). 
InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.079125 4778 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.079193 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/452a4879-b0bd-490e-bffb-25f8404a6eac-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.079206 4778 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/452a4879-b0bd-490e-bffb-25f8404a6eac-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.079221 4778 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/452a4879-b0bd-490e-bffb-25f8404a6eac-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.079231 4778 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.079242 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fggq2\" (UniqueName: \"kubernetes.io/projected/452a4879-b0bd-490e-bffb-25f8404a6eac-kube-api-access-fggq2\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.079251 4778 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/452a4879-b0bd-490e-bffb-25f8404a6eac-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.240940 4778 generic.go:334] "Generic (PLEG): container finished" podID="452a4879-b0bd-490e-bffb-25f8404a6eac" containerID="999172d57c203d9f1102512f66f8a88c1d21a4cb0ed4beb85999edcc16149741" exitCode=0 Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.240976 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" event={"ID":"452a4879-b0bd-490e-bffb-25f8404a6eac","Type":"ContainerDied","Data":"999172d57c203d9f1102512f66f8a88c1d21a4cb0ed4beb85999edcc16149741"} Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.241008 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.241013 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-dtqsp" event={"ID":"452a4879-b0bd-490e-bffb-25f8404a6eac","Type":"ContainerDied","Data":"4039dddd8e26351159f4a9395e62f5cbecc91b6b066c5e638f58ce408c37e5dc"} Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.241024 4778 scope.go:117] "RemoveContainer" containerID="999172d57c203d9f1102512f66f8a88c1d21a4cb0ed4beb85999edcc16149741" Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.255893 4778 scope.go:117] "RemoveContainer" containerID="999172d57c203d9f1102512f66f8a88c1d21a4cb0ed4beb85999edcc16149741" Sep 30 17:26:50 crc kubenswrapper[4778]: E0930 17:26:50.256441 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"999172d57c203d9f1102512f66f8a88c1d21a4cb0ed4beb85999edcc16149741\": container with ID starting with 999172d57c203d9f1102512f66f8a88c1d21a4cb0ed4beb85999edcc16149741 not found: ID does not exist" containerID="999172d57c203d9f1102512f66f8a88c1d21a4cb0ed4beb85999edcc16149741" Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.256476 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"999172d57c203d9f1102512f66f8a88c1d21a4cb0ed4beb85999edcc16149741"} err="failed to get container status \"999172d57c203d9f1102512f66f8a88c1d21a4cb0ed4beb85999edcc16149741\": rpc error: code = NotFound desc = could not find container \"999172d57c203d9f1102512f66f8a88c1d21a4cb0ed4beb85999edcc16149741\": container with ID starting with 999172d57c203d9f1102512f66f8a88c1d21a4cb0ed4beb85999edcc16149741 not found: ID does not exist" Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.585202 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"452a4879-b0bd-490e-bffb-25f8404a6eac\" (UID: \"452a4879-b0bd-490e-bffb-25f8404a6eac\") " Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.600237 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "452a4879-b0bd-490e-bffb-25f8404a6eac" (UID: "452a4879-b0bd-490e-bffb-25f8404a6eac"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.673864 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dtqsp"] Sep 30 17:26:50 crc kubenswrapper[4778]: I0930 17:26:50.682050 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-dtqsp"] Sep 30 17:26:51 crc kubenswrapper[4778]: I0930 17:26:51.721757 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="452a4879-b0bd-490e-bffb-25f8404a6eac" path="/var/lib/kubelet/pods/452a4879-b0bd-490e-bffb-25f8404a6eac/volumes" Sep 30 17:27:14 crc kubenswrapper[4778]: I0930 17:27:14.812318 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:27:14 crc kubenswrapper[4778]: I0930 17:27:14.814010 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.458858 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-nslws"] Sep 30 17:27:25 crc kubenswrapper[4778]: E0930 17:27:25.460025 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452a4879-b0bd-490e-bffb-25f8404a6eac" containerName="registry" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.460048 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="452a4879-b0bd-490e-bffb-25f8404a6eac" containerName="registry" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.460196 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="452a4879-b0bd-490e-bffb-25f8404a6eac" containerName="registry" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.460750 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-nslws" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.468696 4778 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-j7zmv" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.470583 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.470796 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.474755 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-nslws"] Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.486084 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-bkzbw"] Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.487100 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-bkzbw" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.488915 4778 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-sgjdm" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.492491 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-bkzbw"] Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.498945 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-vzjnq"] Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.499880 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-vzjnq" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.504946 4778 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-25lkd" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.514471 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-vzjnq"] Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.606997 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98cnz\" (UniqueName: \"kubernetes.io/projected/a646c0c3-0f15-434d-a414-f1523b29aba5-kube-api-access-98cnz\") pod \"cert-manager-5b446d88c5-bkzbw\" (UID: \"a646c0c3-0f15-434d-a414-f1523b29aba5\") " pod="cert-manager/cert-manager-5b446d88c5-bkzbw" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.607139 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw7s5\" (UniqueName: \"kubernetes.io/projected/e372bf84-9d99-45cc-9225-7ad37b0c60b8-kube-api-access-fw7s5\") pod \"cert-manager-cainjector-7f985d654d-nslws\" (UID: \"e372bf84-9d99-45cc-9225-7ad37b0c60b8\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-nslws" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.607203 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvqj9\" (UniqueName: \"kubernetes.io/projected/0803c4d3-0db6-48be-bc51-a6f24b97ed36-kube-api-access-fvqj9\") pod \"cert-manager-webhook-5655c58dd6-vzjnq\" (UID: \"0803c4d3-0db6-48be-bc51-a6f24b97ed36\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-vzjnq" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.708110 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvqj9\" (UniqueName: \"kubernetes.io/projected/0803c4d3-0db6-48be-bc51-a6f24b97ed36-kube-api-access-fvqj9\") pod \"cert-manager-webhook-5655c58dd6-vzjnq\" (UID: \"0803c4d3-0db6-48be-bc51-a6f24b97ed36\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-vzjnq" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.708239 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98cnz\" (UniqueName: \"kubernetes.io/projected/a646c0c3-0f15-434d-a414-f1523b29aba5-kube-api-access-98cnz\") pod \"cert-manager-5b446d88c5-bkzbw\" (UID: \"a646c0c3-0f15-434d-a414-f1523b29aba5\") " pod="cert-manager/cert-manager-5b446d88c5-bkzbw" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.708270 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw7s5\" (UniqueName: 
\"kubernetes.io/projected/e372bf84-9d99-45cc-9225-7ad37b0c60b8-kube-api-access-fw7s5\") pod \"cert-manager-cainjector-7f985d654d-nslws\" (UID: \"e372bf84-9d99-45cc-9225-7ad37b0c60b8\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-nslws" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.738559 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98cnz\" (UniqueName: \"kubernetes.io/projected/a646c0c3-0f15-434d-a414-f1523b29aba5-kube-api-access-98cnz\") pod \"cert-manager-5b446d88c5-bkzbw\" (UID: \"a646c0c3-0f15-434d-a414-f1523b29aba5\") " pod="cert-manager/cert-manager-5b446d88c5-bkzbw" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.745364 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvqj9\" (UniqueName: \"kubernetes.io/projected/0803c4d3-0db6-48be-bc51-a6f24b97ed36-kube-api-access-fvqj9\") pod \"cert-manager-webhook-5655c58dd6-vzjnq\" (UID: \"0803c4d3-0db6-48be-bc51-a6f24b97ed36\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-vzjnq" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.745814 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw7s5\" (UniqueName: \"kubernetes.io/projected/e372bf84-9d99-45cc-9225-7ad37b0c60b8-kube-api-access-fw7s5\") pod \"cert-manager-cainjector-7f985d654d-nslws\" (UID: \"e372bf84-9d99-45cc-9225-7ad37b0c60b8\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-nslws" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.779083 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-nslws" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.804859 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-bkzbw" Sep 30 17:27:25 crc kubenswrapper[4778]: I0930 17:27:25.814234 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-vzjnq" Sep 30 17:27:26 crc kubenswrapper[4778]: I0930 17:27:26.055680 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-nslws"] Sep 30 17:27:26 crc kubenswrapper[4778]: I0930 17:27:26.076947 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 17:27:26 crc kubenswrapper[4778]: I0930 17:27:26.090466 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-vzjnq"] Sep 30 17:27:26 crc kubenswrapper[4778]: W0930 17:27:26.102104 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0803c4d3_0db6_48be_bc51_a6f24b97ed36.slice/crio-a3afc7c4b4acfe5db5deb040179d82116c860d47c19bc18371b058af66f5d41b WatchSource:0}: Error finding container a3afc7c4b4acfe5db5deb040179d82116c860d47c19bc18371b058af66f5d41b: Status 404 returned error can't find the container with id a3afc7c4b4acfe5db5deb040179d82116c860d47c19bc18371b058af66f5d41b Sep 30 17:27:26 crc kubenswrapper[4778]: I0930 17:27:26.132510 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-bkzbw"] Sep 30 17:27:26 crc kubenswrapper[4778]: I0930 17:27:26.474527 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-bkzbw" event={"ID":"a646c0c3-0f15-434d-a414-f1523b29aba5","Type":"ContainerStarted","Data":"c10ed1646caf5ff488d2a3c8f5fd9b1c9fa943b438ab97f47f35a9350139ee47"} Sep 30 17:27:26 crc kubenswrapper[4778]: I0930 17:27:26.475851 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-nslws" event={"ID":"e372bf84-9d99-45cc-9225-7ad37b0c60b8","Type":"ContainerStarted","Data":"f5251ac4553e02f65484e49cf4175fa4a91e120f208be16b79ca65ebe5aa3111"} Sep 30 17:27:26 crc kubenswrapper[4778]: I0930 17:27:26.477953 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-vzjnq" event={"ID":"0803c4d3-0db6-48be-bc51-a6f24b97ed36","Type":"ContainerStarted","Data":"a3afc7c4b4acfe5db5deb040179d82116c860d47c19bc18371b058af66f5d41b"} Sep 30 17:27:29 crc kubenswrapper[4778]: I0930 17:27:29.500797 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-vzjnq" event={"ID":"0803c4d3-0db6-48be-bc51-a6f24b97ed36","Type":"ContainerStarted","Data":"7b8646b60beeab1c292666ed8b6cd72bbb522db6886668f6d5ecfd711a47e55d"} Sep 30 17:27:29 crc kubenswrapper[4778]: I0930 17:27:29.501726 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-vzjnq" Sep 30 17:27:29 crc kubenswrapper[4778]: I0930 17:27:29.528409 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-vzjnq" podStartSLOduration=2.437168383 podStartE2EDuration="4.528368346s" podCreationTimestamp="2025-09-30 17:27:25 +0000 UTC" firstStartedPulling="2025-09-30 17:27:26.106988756 +0000 UTC m=+585.096886559" lastFinishedPulling="2025-09-30 17:27:28.198188689 +0000 UTC m=+587.188086522" observedRunningTime="2025-09-30 17:27:29.522427719 +0000 UTC m=+588.512325522" watchObservedRunningTime="2025-09-30 17:27:29.528368346 +0000 UTC m=+588.518266159" Sep 30 17:27:30 crc kubenswrapper[4778]: I0930 17:27:30.515556 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager/cert-manager-5b446d88c5-bkzbw" event={"ID":"a646c0c3-0f15-434d-a414-f1523b29aba5","Type":"ContainerStarted","Data":"95ca976f4756a1ea8e33ecefb3322e73bff4ba8eb039ceceeed78e3ce920180d"} Sep 30 17:27:30 crc kubenswrapper[4778]: I0930 17:27:30.517545 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-nslws" event={"ID":"e372bf84-9d99-45cc-9225-7ad37b0c60b8","Type":"ContainerStarted","Data":"35f5eef4f0359dec6fa7336371d525d714cb2c8778f75ebb8e63787e2c7d8474"} Sep 30 17:27:30 crc kubenswrapper[4778]: I0930 17:27:30.537785 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-bkzbw" podStartSLOduration=2.025653098 podStartE2EDuration="5.537754325s" podCreationTimestamp="2025-09-30 17:27:25 +0000 UTC" firstStartedPulling="2025-09-30 17:27:26.135885642 +0000 UTC m=+585.125783435" lastFinishedPulling="2025-09-30 17:27:29.647986849 +0000 UTC m=+588.637884662" observedRunningTime="2025-09-30 17:27:30.532052914 +0000 UTC m=+589.521950727" watchObservedRunningTime="2025-09-30 17:27:30.537754325 +0000 UTC m=+589.527652138" Sep 30 17:27:30 crc kubenswrapper[4778]: I0930 17:27:30.552026 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-nslws" podStartSLOduration=1.975541331 podStartE2EDuration="5.551983527s" podCreationTimestamp="2025-09-30 17:27:25 +0000 UTC" firstStartedPulling="2025-09-30 17:27:26.075868799 +0000 UTC m=+585.065766602" lastFinishedPulling="2025-09-30 17:27:29.652310995 +0000 UTC m=+588.642208798" observedRunningTime="2025-09-30 17:27:30.551565923 +0000 UTC m=+589.541463726" watchObservedRunningTime="2025-09-30 17:27:30.551983527 +0000 UTC m=+589.541881330" Sep 30 17:27:35 crc kubenswrapper[4778]: I0930 17:27:35.820218 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-vzjnq" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.101332 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kzlfx"] Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.101710 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovn-controller" containerID="cri-o://b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2" gracePeriod=30 Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.102044 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="sbdb" containerID="cri-o://74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c" gracePeriod=30 Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.102085 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="nbdb" containerID="cri-o://1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33" gracePeriod=30 Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.102128 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="northd" 
containerID="cri-o://684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd" gracePeriod=30 Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.102171 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145" gracePeriod=30 Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.102204 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="kube-rbac-proxy-node" containerID="cri-o://0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3" gracePeriod=30 Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.102248 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovn-acl-logging" containerID="cri-o://fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab" gracePeriod=30 Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.158472 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovnkube-controller" containerID="cri-o://2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b" gracePeriod=30 Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.454687 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovnkube-controller/3.log" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.457796 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovn-acl-logging/0.log" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.458581 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovn-controller/0.log" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.459528 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.547547 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dpdrh"] Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.548103 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovn-controller" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548149 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovn-controller" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.548172 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovnkube-controller" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548190 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovnkube-controller" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.548217 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="northd" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548238 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="northd" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.548262 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovn-acl-logging" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548279 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovn-acl-logging" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.548298 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="kube-rbac-proxy-node" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548314 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="kube-rbac-proxy-node" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.548340 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="kubecfg-setup" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548358 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="kubecfg-setup" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.548381 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovnkube-controller" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548396 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovnkube-controller" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.548413 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="sbdb" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548429 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="sbdb" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.548449 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="nbdb" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548464 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="nbdb" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.548493 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovnkube-controller" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548515 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovnkube-controller" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.548533 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548550 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.548580 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovnkube-controller" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548597 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovnkube-controller" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548864 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovnkube-controller" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548897 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="nbdb" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548920 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovnkube-controller" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548940 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="kube-rbac-proxy-node" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548964 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.548988 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovn-controller" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.549015 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="sbdb" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.549041 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovn-acl-logging" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.549063 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="northd" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.549083 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovnkube-controller" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 
Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.549334 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovnkube-controller"
Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.549573 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovnkube-controller"
Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.549612 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerName="ovnkube-controller"
Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.552944 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh"
Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.566773 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vmbxd_99e0ced4-d228-4bfa-a263-b8934f0d8e5d/kube-multus/2.log"
Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.567690 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vmbxd_99e0ced4-d228-4bfa-a263-b8934f0d8e5d/kube-multus/1.log"
Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.567741 4778 generic.go:334] "Generic (PLEG): container finished" podID="99e0ced4-d228-4bfa-a263-b8934f0d8e5d" containerID="5555f4f469382c30ef23a0546878aecb39a8bf359d2403619aa573b16c9c3a19" exitCode=2
Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.567822 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vmbxd" event={"ID":"99e0ced4-d228-4bfa-a263-b8934f0d8e5d","Type":"ContainerDied","Data":"5555f4f469382c30ef23a0546878aecb39a8bf359d2403619aa573b16c9c3a19"}
Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.567928 4778 scope.go:117] "RemoveContainer" containerID="b2ba2dd80ae8843f4b47955ec634603e4a462a2f3480f75ce8a6fcbcb628dd75"
Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.569129 4778 scope.go:117] "RemoveContainer" containerID="5555f4f469382c30ef23a0546878aecb39a8bf359d2403619aa573b16c9c3a19"
Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.570423 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-vmbxd_openshift-multus(99e0ced4-d228-4bfa-a263-b8934f0d8e5d)\"" pod="openshift-multus/multus-vmbxd" podUID="99e0ced4-d228-4bfa-a263-b8934f0d8e5d"
Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.571871 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovnkube-controller/3.log"
Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.577104 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovn-acl-logging/0.log"
Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.578593 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kzlfx_deb38969-9012-468f-87aa-2e70a5f8f3c4/ovn-controller/0.log"
Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579307 4778 generic.go:334] "Generic (PLEG): container finished" podID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerID="2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b" exitCode=0
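The multus container, meanwhile, is in a crash loop: it exited with code 2, and the kubelet refuses to restart it immediately, applying CrashLoopBackOff with a 20s delay. The kubelet's restart back-off starts small and doubles per consecutive crash up to a cap of a few minutes, resetting after a sustained healthy run; the 20s here suggests roughly the second consecutive failure. A sketch of that doubling schedule, with the commonly cited 10s initial delay and 5m cap stated as assumptions:

```go
package main

import (
	"fmt"
	"time"
)

// crashLoopDelay returns the back-off before restart attempt n
// (1-based), assuming a 10s initial delay doubling toward a 5m cap —
// the commonly cited kubelet defaults; treat both as assumptions.
func crashLoopDelay(n int) time.Duration {
	delay := 10 * time.Second
	for i := 1; i < n; i++ {
		if delay *= 2; delay >= 5*time.Minute {
			return 5 * time.Minute
		}
	}
	return delay
}

func main() {
	for n := 1; n <= 6; n++ {
		fmt.Printf("crash #%d -> back-off %s\n", n, crashLoopDelay(n))
	}
	// crash #2 -> back-off 20s, matching "back-off 20s restarting failed
	// container=kube-multus" in the log above.
}
```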
podID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerID="2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b" exitCode=0 Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579340 4778 generic.go:334] "Generic (PLEG): container finished" podID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerID="74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c" exitCode=0 Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579350 4778 generic.go:334] "Generic (PLEG): container finished" podID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerID="1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33" exitCode=0 Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579364 4778 generic.go:334] "Generic (PLEG): container finished" podID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerID="684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd" exitCode=0 Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579373 4778 generic.go:334] "Generic (PLEG): container finished" podID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerID="810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145" exitCode=0 Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579382 4778 generic.go:334] "Generic (PLEG): container finished" podID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerID="0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3" exitCode=0 Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579392 4778 generic.go:334] "Generic (PLEG): container finished" podID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerID="fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab" exitCode=143 Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579406 4778 generic.go:334] "Generic (PLEG): container finished" podID="deb38969-9012-468f-87aa-2e70a5f8f3c4" containerID="b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2" exitCode=143 Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579431 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerDied","Data":"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579466 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerDied","Data":"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579486 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerDied","Data":"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579502 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerDied","Data":"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579516 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerDied","Data":"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145"} Sep 30 17:27:36 crc 
kubenswrapper[4778]: I0930 17:27:36.579529 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerDied","Data":"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579544 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579559 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579567 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579575 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579582 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579589 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579596 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579602 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579609 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579634 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579644 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerDied","Data":"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579655 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579665 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579673 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579681 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579689 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579697 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579705 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579712 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579721 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579730 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579741 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerDied","Data":"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579753 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579761 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579768 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579775 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579782 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579790 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579800 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579807 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579815 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579835 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579844 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" event={"ID":"deb38969-9012-468f-87aa-2e70a5f8f3c4","Type":"ContainerDied","Data":"a5691e2d24f75ec03534d5562335ee8be013408ab586b08b46003a7c95bc2b07"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579856 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579865 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579872 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579882 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579891 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579899 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579908 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579916 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579923 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.579931 4778 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b"} Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.580944 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kzlfx" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.581580 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-etc-openvswitch\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.581693 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.581748 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-systemd-units\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.581800 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-node-log\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.581873 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-run-netns\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582144 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582184 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582252 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-node-log" (OuterVolumeSpecName: "node-log") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582250 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582291 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.581908 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-var-lib-openvswitch\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582393 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-kubelet\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582444 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovnkube-script-lib\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582479 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-run-ovn-kubernetes\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582514 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-cni-bin\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582551 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-cni-netd\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: 
\"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582589 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-log-socket\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582638 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-openvswitch\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582677 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-systemd\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582725 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx6xj\" (UniqueName: \"kubernetes.io/projected/deb38969-9012-468f-87aa-2e70a5f8f3c4-kube-api-access-mx6xj\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582780 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovn-node-metrics-cert\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582824 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovnkube-config\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582875 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-slash\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582958 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-env-overrides\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.582996 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-ovn\") pod \"deb38969-9012-468f-87aa-2e70a5f8f3c4\" (UID: \"deb38969-9012-468f-87aa-2e70a5f8f3c4\") " Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.583042 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-log-socket" (OuterVolumeSpecName: "log-socket") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: 
"deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.583106 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.583146 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.583361 4778 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-log-socket\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.583387 4778 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.583409 4778 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.583427 4778 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-systemd-units\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.583447 4778 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-node-log\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.583466 4778 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-run-netns\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.583483 4778 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.583502 4778 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-kubelet\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.583551 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). 
InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.583595 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.584192 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.584782 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-slash" (OuterVolumeSpecName: "host-slash") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.585502 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.585589 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.585598 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.585727 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.590212 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). 
InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.597574 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.597660 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb38969-9012-468f-87aa-2e70a5f8f3c4-kube-api-access-mx6xj" (OuterVolumeSpecName: "kube-api-access-mx6xj") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "kube-api-access-mx6xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.603057 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "deb38969-9012-468f-87aa-2e70a5f8f3c4" (UID: "deb38969-9012-468f-87aa-2e70a5f8f3c4"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.604591 4778 scope.go:117] "RemoveContainer" containerID="2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.632969 4778 scope.go:117] "RemoveContainer" containerID="2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.651128 4778 scope.go:117] "RemoveContainer" containerID="74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.667596 4778 scope.go:117] "RemoveContainer" containerID="1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.684554 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf96x\" (UniqueName: \"kubernetes.io/projected/fc337f2e-c0d8-4c5d-9c78-9b972e225609-kube-api-access-jf96x\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.685095 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-cni-bin\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.685271 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-run-ovn\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.685315 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-kubelet\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.685384 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc337f2e-c0d8-4c5d-9c78-9b972e225609-env-overrides\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.685450 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-run-systemd\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.685481 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-log-socket\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.685601 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-run-ovn-kubernetes\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.685668 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc337f2e-c0d8-4c5d-9c78-9b972e225609-ovnkube-script-lib\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.685705 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc337f2e-c0d8-4c5d-9c78-9b972e225609-ovn-node-metrics-cert\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686180 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-run-openvswitch\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686375 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-cni-netd\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686416 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-run-netns\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686461 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-slash\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686509 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc337f2e-c0d8-4c5d-9c78-9b972e225609-ovnkube-config\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686549 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686592 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-systemd-units\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686673 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-etc-openvswitch\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686707 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-node-log\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686742 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-var-lib-openvswitch\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686831 4778 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686849 4778 
reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686865 4778 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686883 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686895 4778 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-cni-bin\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686910 4778 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-cni-netd\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686924 4778 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686938 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx6xj\" (UniqueName: \"kubernetes.io/projected/deb38969-9012-468f-87aa-2e70a5f8f3c4-kube-api-access-mx6xj\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686951 4778 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-run-systemd\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686963 4778 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686975 4778 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/deb38969-9012-468f-87aa-2e70a5f8f3c4-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.686987 4778 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/deb38969-9012-468f-87aa-2e70a5f8f3c4-host-slash\") on node \"crc\" DevicePath \"\"" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.691147 4778 scope.go:117] "RemoveContainer" containerID="684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.713774 4778 scope.go:117] "RemoveContainer" containerID="810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.734795 4778 scope.go:117] "RemoveContainer" containerID="0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.756692 4778 scope.go:117] "RemoveContainer" 
containerID="fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.778583 4778 scope.go:117] "RemoveContainer" containerID="b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.789215 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-run-ovn\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.789275 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-kubelet\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.789323 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc337f2e-c0d8-4c5d-9c78-9b972e225609-env-overrides\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.789360 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-run-systemd\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.789410 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-kubelet\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.789431 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-log-socket\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.789438 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-run-ovn\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.789521 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-log-socket\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.789665 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-run-systemd\") pod \"ovnkube-node-dpdrh\" (UID: 
\"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.789733 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-run-ovn-kubernetes\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.789792 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-run-ovn-kubernetes\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.789866 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc337f2e-c0d8-4c5d-9c78-9b972e225609-ovnkube-script-lib\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.789984 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc337f2e-c0d8-4c5d-9c78-9b972e225609-ovn-node-metrics-cert\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.790111 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-run-openvswitch\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.790297 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-cni-netd\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.790447 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-run-netns\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.790761 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-slash\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.790838 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-cni-netd\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.790764 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fc337f2e-c0d8-4c5d-9c78-9b972e225609-env-overrides\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.791097 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-run-openvswitch\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.791160 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-run-netns\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.791281 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc337f2e-c0d8-4c5d-9c78-9b972e225609-ovnkube-config\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.791344 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.791405 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-systemd-units\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.791479 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-node-log\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.791498 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-slash\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.791511 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fc337f2e-c0d8-4c5d-9c78-9b972e225609-ovnkube-script-lib\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 
17:27:36.791673 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-node-log\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.791608 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-var-lib-openvswitch\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.791803 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-systemd-units\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.791818 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-var-lib-openvswitch\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.791852 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-etc-openvswitch\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.791887 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.791962 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jf96x\" (UniqueName: \"kubernetes.io/projected/fc337f2e-c0d8-4c5d-9c78-9b972e225609-kube-api-access-jf96x\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.792024 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-cni-bin\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.791966 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-etc-openvswitch\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.792704 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fc337f2e-c0d8-4c5d-9c78-9b972e225609-host-cni-bin\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.792956 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fc337f2e-c0d8-4c5d-9c78-9b972e225609-ovnkube-config\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.795143 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fc337f2e-c0d8-4c5d-9c78-9b972e225609-ovn-node-metrics-cert\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.801854 4778 scope.go:117] "RemoveContainer" containerID="f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.816340 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf96x\" (UniqueName: \"kubernetes.io/projected/fc337f2e-c0d8-4c5d-9c78-9b972e225609-kube-api-access-jf96x\") pod \"ovnkube-node-dpdrh\" (UID: \"fc337f2e-c0d8-4c5d-9c78-9b972e225609\") " pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.828840 4778 scope.go:117] "RemoveContainer" containerID="2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.829433 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b\": container with ID starting with 2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b not found: ID does not exist" containerID="2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.829493 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b"} err="failed to get container status \"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b\": rpc error: code = NotFound desc = could not find container \"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b\": container with ID starting with 2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.829535 4778 scope.go:117] "RemoveContainer" containerID="2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.829915 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca\": container with ID starting with 2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca not found: ID does not exist" containerID="2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.829976 
4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca"} err="failed to get container status \"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca\": rpc error: code = NotFound desc = could not find container \"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca\": container with ID starting with 2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.829996 4778 scope.go:117] "RemoveContainer" containerID="74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.830406 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\": container with ID starting with 74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c not found: ID does not exist" containerID="74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.830459 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c"} err="failed to get container status \"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\": rpc error: code = NotFound desc = could not find container \"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\": container with ID starting with 74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.830494 4778 scope.go:117] "RemoveContainer" containerID="1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.830909 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\": container with ID starting with 1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33 not found: ID does not exist" containerID="1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.830933 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33"} err="failed to get container status \"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\": rpc error: code = NotFound desc = could not find container \"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\": container with ID starting with 1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33 not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.830954 4778 scope.go:117] "RemoveContainer" containerID="684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.831356 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\": container with ID starting with 684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd not found: ID 
does not exist" containerID="684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.831415 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd"} err="failed to get container status \"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\": rpc error: code = NotFound desc = could not find container \"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\": container with ID starting with 684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.831493 4778 scope.go:117] "RemoveContainer" containerID="810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.831950 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\": container with ID starting with 810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145 not found: ID does not exist" containerID="810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.831989 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145"} err="failed to get container status \"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\": rpc error: code = NotFound desc = could not find container \"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\": container with ID starting with 810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145 not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.832029 4778 scope.go:117] "RemoveContainer" containerID="0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.832403 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\": container with ID starting with 0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3 not found: ID does not exist" containerID="0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.832457 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3"} err="failed to get container status \"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\": rpc error: code = NotFound desc = could not find container \"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\": container with ID starting with 0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3 not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.832494 4778 scope.go:117] "RemoveContainer" containerID="fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.832867 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\": container with ID starting with fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab not found: ID does not exist" containerID="fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.832922 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab"} err="failed to get container status \"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\": rpc error: code = NotFound desc = could not find container \"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\": container with ID starting with fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.832941 4778 scope.go:117] "RemoveContainer" containerID="b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.833261 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\": container with ID starting with b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2 not found: ID does not exist" containerID="b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.833319 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2"} err="failed to get container status \"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\": rpc error: code = NotFound desc = could not find container \"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\": container with ID starting with b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2 not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.833338 4778 scope.go:117] "RemoveContainer" containerID="f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b" Sep 30 17:27:36 crc kubenswrapper[4778]: E0930 17:27:36.833822 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\": container with ID starting with f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b not found: ID does not exist" containerID="f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.833884 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b"} err="failed to get container status \"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\": rpc error: code = NotFound desc = could not find container \"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\": container with ID starting with f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.833933 4778 scope.go:117] "RemoveContainer" containerID="2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b" Sep 30 17:27:36 crc 
kubenswrapper[4778]: I0930 17:27:36.834404 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b"} err="failed to get container status \"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b\": rpc error: code = NotFound desc = could not find container \"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b\": container with ID starting with 2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.834463 4778 scope.go:117] "RemoveContainer" containerID="2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.834905 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca"} err="failed to get container status \"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca\": rpc error: code = NotFound desc = could not find container \"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca\": container with ID starting with 2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.835008 4778 scope.go:117] "RemoveContainer" containerID="74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.835359 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c"} err="failed to get container status \"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\": rpc error: code = NotFound desc = could not find container \"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\": container with ID starting with 74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.835389 4778 scope.go:117] "RemoveContainer" containerID="1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.835778 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33"} err="failed to get container status \"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\": rpc error: code = NotFound desc = could not find container \"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\": container with ID starting with 1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33 not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.835811 4778 scope.go:117] "RemoveContainer" containerID="684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.836288 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd"} err="failed to get container status \"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\": rpc error: code = NotFound desc = could not find container \"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\": container with ID 
starting with 684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.836328 4778 scope.go:117] "RemoveContainer" containerID="810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.836829 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145"} err="failed to get container status \"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\": rpc error: code = NotFound desc = could not find container \"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\": container with ID starting with 810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145 not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.836859 4778 scope.go:117] "RemoveContainer" containerID="0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.837181 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3"} err="failed to get container status \"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\": rpc error: code = NotFound desc = could not find container \"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\": container with ID starting with 0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3 not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.837240 4778 scope.go:117] "RemoveContainer" containerID="fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.837913 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab"} err="failed to get container status \"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\": rpc error: code = NotFound desc = could not find container \"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\": container with ID starting with fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.837984 4778 scope.go:117] "RemoveContainer" containerID="b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.838318 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2"} err="failed to get container status \"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\": rpc error: code = NotFound desc = could not find container \"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\": container with ID starting with b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2 not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.838358 4778 scope.go:117] "RemoveContainer" containerID="f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.838965 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b"} err="failed to get container status \"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\": rpc error: code = NotFound desc = could not find container \"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\": container with ID starting with f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.839052 4778 scope.go:117] "RemoveContainer" containerID="2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.839454 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b"} err="failed to get container status \"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b\": rpc error: code = NotFound desc = could not find container \"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b\": container with ID starting with 2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.839510 4778 scope.go:117] "RemoveContainer" containerID="2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.839937 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca"} err="failed to get container status \"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca\": rpc error: code = NotFound desc = could not find container \"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca\": container with ID starting with 2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.840011 4778 scope.go:117] "RemoveContainer" containerID="74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.840462 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c"} err="failed to get container status \"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\": rpc error: code = NotFound desc = could not find container \"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\": container with ID starting with 74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.840541 4778 scope.go:117] "RemoveContainer" containerID="1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.840926 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33"} err="failed to get container status \"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\": rpc error: code = NotFound desc = could not find container \"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\": container with ID starting with 1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33 not found: ID does not exist" Sep 
30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.840959 4778 scope.go:117] "RemoveContainer" containerID="684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.841320 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd"} err="failed to get container status \"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\": rpc error: code = NotFound desc = could not find container \"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\": container with ID starting with 684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.841382 4778 scope.go:117] "RemoveContainer" containerID="810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.841746 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145"} err="failed to get container status \"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\": rpc error: code = NotFound desc = could not find container \"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\": container with ID starting with 810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145 not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.841772 4778 scope.go:117] "RemoveContainer" containerID="0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.841963 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3"} err="failed to get container status \"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\": rpc error: code = NotFound desc = could not find container \"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\": container with ID starting with 0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3 not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.841979 4778 scope.go:117] "RemoveContainer" containerID="fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.842327 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab"} err="failed to get container status \"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\": rpc error: code = NotFound desc = could not find container \"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\": container with ID starting with fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.842395 4778 scope.go:117] "RemoveContainer" containerID="b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.843026 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2"} err="failed to get container status 
\"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\": rpc error: code = NotFound desc = could not find container \"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\": container with ID starting with b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2 not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.843096 4778 scope.go:117] "RemoveContainer" containerID="f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.843467 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b"} err="failed to get container status \"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\": rpc error: code = NotFound desc = could not find container \"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\": container with ID starting with f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.843517 4778 scope.go:117] "RemoveContainer" containerID="2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.843964 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b"} err="failed to get container status \"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b\": rpc error: code = NotFound desc = could not find container \"2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b\": container with ID starting with 2e2c9043d627d3f328fcfc1861c04ea8fc4d80b30f971df7f3879fec341b9c3b not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.844011 4778 scope.go:117] "RemoveContainer" containerID="2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.844982 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca"} err="failed to get container status \"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca\": rpc error: code = NotFound desc = could not find container \"2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca\": container with ID starting with 2b33e20d080c52728fb95819f8441a050f8e77b734b98843d99d069c978f35ca not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.845026 4778 scope.go:117] "RemoveContainer" containerID="74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.845568 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c"} err="failed to get container status \"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\": rpc error: code = NotFound desc = could not find container \"74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c\": container with ID starting with 74937f39cc655d32cfbd6db662c13062572df6117d5a2e166e10864a9f9cb36c not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.845608 4778 scope.go:117] "RemoveContainer" 
containerID="1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.845932 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33"} err="failed to get container status \"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\": rpc error: code = NotFound desc = could not find container \"1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33\": container with ID starting with 1d16bc1b25ca0dfcfb72482324fabbf7fb4dedb84911b2d1b3369eae432cbd33 not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.845965 4778 scope.go:117] "RemoveContainer" containerID="684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.846227 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd"} err="failed to get container status \"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\": rpc error: code = NotFound desc = could not find container \"684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd\": container with ID starting with 684bc32512f037c0e21cd287ade9afc88e83d1d97b4ffde467ebdd47ee33c6dd not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.846256 4778 scope.go:117] "RemoveContainer" containerID="810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.846646 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145"} err="failed to get container status \"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\": rpc error: code = NotFound desc = could not find container \"810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145\": container with ID starting with 810f1e834cce927d3474bbfeec93be794364ba6d2215112a7c7828f3f7e44145 not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.846683 4778 scope.go:117] "RemoveContainer" containerID="0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.847071 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3"} err="failed to get container status \"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\": rpc error: code = NotFound desc = could not find container \"0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3\": container with ID starting with 0315c11b141921a65d345804a37085a6a49a782e2709ed0e86452f86468c72d3 not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.847114 4778 scope.go:117] "RemoveContainer" containerID="fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.847393 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab"} err="failed to get container status \"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\": rpc error: code = NotFound desc = could not find 
container \"fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab\": container with ID starting with fbb3e92eb019ed2159b27a25769ce2cc68055423c7ce8adcc22d2fb0ba693aab not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.847453 4778 scope.go:117] "RemoveContainer" containerID="b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.848252 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2"} err="failed to get container status \"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\": rpc error: code = NotFound desc = could not find container \"b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2\": container with ID starting with b4f50c3c56d264cf94c198956b0e249d0b5a80bf0123ff1a69f48d9040b9fbf2 not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.848300 4778 scope.go:117] "RemoveContainer" containerID="f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.848673 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b"} err="failed to get container status \"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\": rpc error: code = NotFound desc = could not find container \"f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b\": container with ID starting with f9c622f29debe1a221f7ec5fc2563492dfa8467e0f1705df37f57db042ee810b not found: ID does not exist" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.881417 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.917554 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kzlfx"] Sep 30 17:27:36 crc kubenswrapper[4778]: I0930 17:27:36.921477 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kzlfx"] Sep 30 17:27:37 crc kubenswrapper[4778]: I0930 17:27:37.601785 4778 generic.go:334] "Generic (PLEG): container finished" podID="fc337f2e-c0d8-4c5d-9c78-9b972e225609" containerID="4b7ca835c2a0c7aae7919dfcee51e301ddef66a1b8edb8eb573d28d26f505fe7" exitCode=0 Sep 30 17:27:37 crc kubenswrapper[4778]: I0930 17:27:37.602405 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" event={"ID":"fc337f2e-c0d8-4c5d-9c78-9b972e225609","Type":"ContainerDied","Data":"4b7ca835c2a0c7aae7919dfcee51e301ddef66a1b8edb8eb573d28d26f505fe7"} Sep 30 17:27:37 crc kubenswrapper[4778]: I0930 17:27:37.604152 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" event={"ID":"fc337f2e-c0d8-4c5d-9c78-9b972e225609","Type":"ContainerStarted","Data":"296c64184ab28d36212521a32a66a3ee35fa8a11ef732a30aa81b2ce7a40685c"} Sep 30 17:27:37 crc kubenswrapper[4778]: I0930 17:27:37.642571 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vmbxd_99e0ced4-d228-4bfa-a263-b8934f0d8e5d/kube-multus/2.log" Sep 30 17:27:37 crc kubenswrapper[4778]: I0930 17:27:37.725359 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb38969-9012-468f-87aa-2e70a5f8f3c4" path="/var/lib/kubelet/pods/deb38969-9012-468f-87aa-2e70a5f8f3c4/volumes" Sep 30 17:27:38 crc kubenswrapper[4778]: I0930 17:27:38.656904 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" event={"ID":"fc337f2e-c0d8-4c5d-9c78-9b972e225609","Type":"ContainerStarted","Data":"bdffc2f8cfe2d9b7d471d2ca610a0b6ac03b0a8f120591b3408573c11ef078b2"} Sep 30 17:27:38 crc kubenswrapper[4778]: I0930 17:27:38.657513 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" event={"ID":"fc337f2e-c0d8-4c5d-9c78-9b972e225609","Type":"ContainerStarted","Data":"a742e3ced5c5bb9941f54fea255a128e8bffd76d2f40c54eafcdb5222c7f625b"} Sep 30 17:27:38 crc kubenswrapper[4778]: I0930 17:27:38.657552 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" event={"ID":"fc337f2e-c0d8-4c5d-9c78-9b972e225609","Type":"ContainerStarted","Data":"ff653320cb0a64bd9d3a830249bbbf8d1aa0c2d07c6c3b365ef0ad80e947d004"} Sep 30 17:27:38 crc kubenswrapper[4778]: I0930 17:27:38.657574 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" event={"ID":"fc337f2e-c0d8-4c5d-9c78-9b972e225609","Type":"ContainerStarted","Data":"a67f0b01d3157d62b5458590e881b54e2f74f8d270a6a8bcd3b906ab1d587ff4"} Sep 30 17:27:38 crc kubenswrapper[4778]: I0930 17:27:38.657596 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" event={"ID":"fc337f2e-c0d8-4c5d-9c78-9b972e225609","Type":"ContainerStarted","Data":"945d2d49074f0dd7749cc5c6d73259ac218794231b3c588407b663a64c448483"} Sep 30 17:27:38 crc kubenswrapper[4778]: I0930 17:27:38.657641 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" 
event={"ID":"fc337f2e-c0d8-4c5d-9c78-9b972e225609","Type":"ContainerStarted","Data":"c15ea2d504696ca050172fdd82021e2854e5e861171c513f4053ae1201f53aee"} Sep 30 17:27:41 crc kubenswrapper[4778]: I0930 17:27:41.685444 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" event={"ID":"fc337f2e-c0d8-4c5d-9c78-9b972e225609","Type":"ContainerStarted","Data":"5682facc6bb7c8d14f623f381bb49080a4fc4a767041bdf7f5d16efdd4bf37b2"} Sep 30 17:27:43 crc kubenswrapper[4778]: I0930 17:27:43.704569 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" event={"ID":"fc337f2e-c0d8-4c5d-9c78-9b972e225609","Type":"ContainerStarted","Data":"a1eb3f848adc0ebfc5a3c82ba3c9f7eb2aa4f6eb05babe63757f3a25c9ac96dc"} Sep 30 17:27:43 crc kubenswrapper[4778]: I0930 17:27:43.705094 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:43 crc kubenswrapper[4778]: I0930 17:27:43.705112 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:43 crc kubenswrapper[4778]: I0930 17:27:43.736298 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:43 crc kubenswrapper[4778]: I0930 17:27:43.742708 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" podStartSLOduration=7.742678465 podStartE2EDuration="7.742678465s" podCreationTimestamp="2025-09-30 17:27:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:27:43.73998406 +0000 UTC m=+602.729881883" watchObservedRunningTime="2025-09-30 17:27:43.742678465 +0000 UTC m=+602.732576268" Sep 30 17:27:44 crc kubenswrapper[4778]: I0930 17:27:44.713690 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:44 crc kubenswrapper[4778]: I0930 17:27:44.759496 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:27:44 crc kubenswrapper[4778]: I0930 17:27:44.812178 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:27:44 crc kubenswrapper[4778]: I0930 17:27:44.812263 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:27:47 crc kubenswrapper[4778]: I0930 17:27:47.714506 4778 scope.go:117] "RemoveContainer" containerID="5555f4f469382c30ef23a0546878aecb39a8bf359d2403619aa573b16c9c3a19" Sep 30 17:27:47 crc kubenswrapper[4778]: E0930 17:27:47.715283 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-vmbxd_openshift-multus(99e0ced4-d228-4bfa-a263-b8934f0d8e5d)\"" 
pod="openshift-multus/multus-vmbxd" podUID="99e0ced4-d228-4bfa-a263-b8934f0d8e5d" Sep 30 17:28:01 crc kubenswrapper[4778]: I0930 17:28:01.717017 4778 scope.go:117] "RemoveContainer" containerID="5555f4f469382c30ef23a0546878aecb39a8bf359d2403619aa573b16c9c3a19" Sep 30 17:28:02 crc kubenswrapper[4778]: I0930 17:28:02.839846 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vmbxd_99e0ced4-d228-4bfa-a263-b8934f0d8e5d/kube-multus/2.log" Sep 30 17:28:02 crc kubenswrapper[4778]: I0930 17:28:02.839933 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vmbxd" event={"ID":"99e0ced4-d228-4bfa-a263-b8934f0d8e5d","Type":"ContainerStarted","Data":"96c07e2e3e14bc29ab328f9e912097c67761940f305806d69e4cae050a3fe800"} Sep 30 17:28:06 crc kubenswrapper[4778]: I0930 17:28:06.922706 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dpdrh" Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.450804 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7"] Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.453294 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.456027 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.466834 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7"] Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.598236 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f131134d-496f-4a7b-849b-89d1c3100208-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7\" (UID: \"f131134d-496f-4a7b-849b-89d1c3100208\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.598358 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78pqg\" (UniqueName: \"kubernetes.io/projected/f131134d-496f-4a7b-849b-89d1c3100208-kube-api-access-78pqg\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7\" (UID: \"f131134d-496f-4a7b-849b-89d1c3100208\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.598387 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f131134d-496f-4a7b-849b-89d1c3100208-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7\" (UID: \"f131134d-496f-4a7b-849b-89d1c3100208\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.700189 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78pqg\" (UniqueName: \"kubernetes.io/projected/f131134d-496f-4a7b-849b-89d1c3100208-kube-api-access-78pqg\") pod 
\"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7\" (UID: \"f131134d-496f-4a7b-849b-89d1c3100208\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.700269 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f131134d-496f-4a7b-849b-89d1c3100208-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7\" (UID: \"f131134d-496f-4a7b-849b-89d1c3100208\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.700315 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f131134d-496f-4a7b-849b-89d1c3100208-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7\" (UID: \"f131134d-496f-4a7b-849b-89d1c3100208\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.701091 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f131134d-496f-4a7b-849b-89d1c3100208-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7\" (UID: \"f131134d-496f-4a7b-849b-89d1c3100208\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.701147 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f131134d-496f-4a7b-849b-89d1c3100208-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7\" (UID: \"f131134d-496f-4a7b-849b-89d1c3100208\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.728677 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78pqg\" (UniqueName: \"kubernetes.io/projected/f131134d-496f-4a7b-849b-89d1c3100208-kube-api-access-78pqg\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7\" (UID: \"f131134d-496f-4a7b-849b-89d1c3100208\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.772516 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.812301 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.812405 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.812482 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.814599 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"87ec9413bfb27167167aeb00914e82262471cb92624b3bf11492e1b4663098b9"} pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.814691 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" containerID="cri-o://87ec9413bfb27167167aeb00914e82262471cb92624b3bf11492e1b4663098b9" gracePeriod=600 Sep 30 17:28:14 crc kubenswrapper[4778]: I0930 17:28:14.993946 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7"] Sep 30 17:28:15 crc kubenswrapper[4778]: W0930 17:28:15.014801 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf131134d_496f_4a7b_849b_89d1c3100208.slice/crio-937c5038e40c747d02e04264aa62e58cb70c0472dd3cd2f0014fc5035b860534 WatchSource:0}: Error finding container 937c5038e40c747d02e04264aa62e58cb70c0472dd3cd2f0014fc5035b860534: Status 404 returned error can't find the container with id 937c5038e40c747d02e04264aa62e58cb70c0472dd3cd2f0014fc5035b860534 Sep 30 17:28:15 crc kubenswrapper[4778]: I0930 17:28:15.932680 4778 generic.go:334] "Generic (PLEG): container finished" podID="f131134d-496f-4a7b-849b-89d1c3100208" containerID="5bf2acb25ce3b9c960d69115c8f008c3ee477637bb78f905a2ade6d305a7eeae" exitCode=0 Sep 30 17:28:15 crc kubenswrapper[4778]: I0930 17:28:15.932798 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" event={"ID":"f131134d-496f-4a7b-849b-89d1c3100208","Type":"ContainerDied","Data":"5bf2acb25ce3b9c960d69115c8f008c3ee477637bb78f905a2ade6d305a7eeae"} Sep 30 17:28:15 crc kubenswrapper[4778]: I0930 17:28:15.933162 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" 
event={"ID":"f131134d-496f-4a7b-849b-89d1c3100208","Type":"ContainerStarted","Data":"937c5038e40c747d02e04264aa62e58cb70c0472dd3cd2f0014fc5035b860534"} Sep 30 17:28:15 crc kubenswrapper[4778]: I0930 17:28:15.939745 4778 generic.go:334] "Generic (PLEG): container finished" podID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerID="87ec9413bfb27167167aeb00914e82262471cb92624b3bf11492e1b4663098b9" exitCode=0 Sep 30 17:28:15 crc kubenswrapper[4778]: I0930 17:28:15.939820 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerDied","Data":"87ec9413bfb27167167aeb00914e82262471cb92624b3bf11492e1b4663098b9"} Sep 30 17:28:15 crc kubenswrapper[4778]: I0930 17:28:15.939881 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerStarted","Data":"7e8e94b4bfe4e71036adf0980559cf4826c826e3aed3c9f8f0d61aee964cec6e"} Sep 30 17:28:15 crc kubenswrapper[4778]: I0930 17:28:15.939917 4778 scope.go:117] "RemoveContainer" containerID="13a00401a21c43e81f99eb39c3b8e765d2e6610548614249a56f21e67091808b" Sep 30 17:28:18 crc kubenswrapper[4778]: I0930 17:28:18.965977 4778 generic.go:334] "Generic (PLEG): container finished" podID="f131134d-496f-4a7b-849b-89d1c3100208" containerID="4bcff240b791397200f989cc8f0b6da79609228334150b5efba6e5033bb0e620" exitCode=0 Sep 30 17:28:18 crc kubenswrapper[4778]: I0930 17:28:18.966176 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" event={"ID":"f131134d-496f-4a7b-849b-89d1c3100208","Type":"ContainerDied","Data":"4bcff240b791397200f989cc8f0b6da79609228334150b5efba6e5033bb0e620"} Sep 30 17:28:19 crc kubenswrapper[4778]: I0930 17:28:19.980575 4778 generic.go:334] "Generic (PLEG): container finished" podID="f131134d-496f-4a7b-849b-89d1c3100208" containerID="5c5522cf63354cd4659124c1097f10844bdb26f80979e9607fa085f0b6298224" exitCode=0 Sep 30 17:28:19 crc kubenswrapper[4778]: I0930 17:28:19.980889 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" event={"ID":"f131134d-496f-4a7b-849b-89d1c3100208","Type":"ContainerDied","Data":"5c5522cf63354cd4659124c1097f10844bdb26f80979e9607fa085f0b6298224"} Sep 30 17:28:21 crc kubenswrapper[4778]: I0930 17:28:21.314792 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" Sep 30 17:28:21 crc kubenswrapper[4778]: I0930 17:28:21.416552 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78pqg\" (UniqueName: \"kubernetes.io/projected/f131134d-496f-4a7b-849b-89d1c3100208-kube-api-access-78pqg\") pod \"f131134d-496f-4a7b-849b-89d1c3100208\" (UID: \"f131134d-496f-4a7b-849b-89d1c3100208\") " Sep 30 17:28:21 crc kubenswrapper[4778]: I0930 17:28:21.416733 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f131134d-496f-4a7b-849b-89d1c3100208-util\") pod \"f131134d-496f-4a7b-849b-89d1c3100208\" (UID: \"f131134d-496f-4a7b-849b-89d1c3100208\") " Sep 30 17:28:21 crc kubenswrapper[4778]: I0930 17:28:21.416790 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f131134d-496f-4a7b-849b-89d1c3100208-bundle\") pod \"f131134d-496f-4a7b-849b-89d1c3100208\" (UID: \"f131134d-496f-4a7b-849b-89d1c3100208\") " Sep 30 17:28:21 crc kubenswrapper[4778]: I0930 17:28:21.417635 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f131134d-496f-4a7b-849b-89d1c3100208-bundle" (OuterVolumeSpecName: "bundle") pod "f131134d-496f-4a7b-849b-89d1c3100208" (UID: "f131134d-496f-4a7b-849b-89d1c3100208"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:28:21 crc kubenswrapper[4778]: I0930 17:28:21.427581 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f131134d-496f-4a7b-849b-89d1c3100208-kube-api-access-78pqg" (OuterVolumeSpecName: "kube-api-access-78pqg") pod "f131134d-496f-4a7b-849b-89d1c3100208" (UID: "f131134d-496f-4a7b-849b-89d1c3100208"). InnerVolumeSpecName "kube-api-access-78pqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:28:21 crc kubenswrapper[4778]: I0930 17:28:21.432253 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f131134d-496f-4a7b-849b-89d1c3100208-util" (OuterVolumeSpecName: "util") pod "f131134d-496f-4a7b-849b-89d1c3100208" (UID: "f131134d-496f-4a7b-849b-89d1c3100208"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:28:21 crc kubenswrapper[4778]: I0930 17:28:21.518820 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78pqg\" (UniqueName: \"kubernetes.io/projected/f131134d-496f-4a7b-849b-89d1c3100208-kube-api-access-78pqg\") on node \"crc\" DevicePath \"\"" Sep 30 17:28:21 crc kubenswrapper[4778]: I0930 17:28:21.518878 4778 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f131134d-496f-4a7b-849b-89d1c3100208-util\") on node \"crc\" DevicePath \"\"" Sep 30 17:28:21 crc kubenswrapper[4778]: I0930 17:28:21.518898 4778 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f131134d-496f-4a7b-849b-89d1c3100208-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:28:22 crc kubenswrapper[4778]: I0930 17:28:22.000471 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" event={"ID":"f131134d-496f-4a7b-849b-89d1c3100208","Type":"ContainerDied","Data":"937c5038e40c747d02e04264aa62e58cb70c0472dd3cd2f0014fc5035b860534"} Sep 30 17:28:22 crc kubenswrapper[4778]: I0930 17:28:22.001051 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="937c5038e40c747d02e04264aa62e58cb70c0472dd3cd2f0014fc5035b860534" Sep 30 17:28:22 crc kubenswrapper[4778]: I0930 17:28:22.000545 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7" Sep 30 17:28:23 crc kubenswrapper[4778]: I0930 17:28:23.037222 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-ktwgv"] Sep 30 17:28:23 crc kubenswrapper[4778]: E0930 17:28:23.037473 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f131134d-496f-4a7b-849b-89d1c3100208" containerName="util" Sep 30 17:28:23 crc kubenswrapper[4778]: I0930 17:28:23.037486 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f131134d-496f-4a7b-849b-89d1c3100208" containerName="util" Sep 30 17:28:23 crc kubenswrapper[4778]: E0930 17:28:23.037506 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f131134d-496f-4a7b-849b-89d1c3100208" containerName="pull" Sep 30 17:28:23 crc kubenswrapper[4778]: I0930 17:28:23.037512 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f131134d-496f-4a7b-849b-89d1c3100208" containerName="pull" Sep 30 17:28:23 crc kubenswrapper[4778]: E0930 17:28:23.037523 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f131134d-496f-4a7b-849b-89d1c3100208" containerName="extract" Sep 30 17:28:23 crc kubenswrapper[4778]: I0930 17:28:23.037530 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f131134d-496f-4a7b-849b-89d1c3100208" containerName="extract" Sep 30 17:28:23 crc kubenswrapper[4778]: I0930 17:28:23.037645 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f131134d-496f-4a7b-849b-89d1c3100208" containerName="extract" Sep 30 17:28:23 crc kubenswrapper[4778]: I0930 17:28:23.038180 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-ktwgv" Sep 30 17:28:23 crc kubenswrapper[4778]: I0930 17:28:23.040581 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Sep 30 17:28:23 crc kubenswrapper[4778]: I0930 17:28:23.040604 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-58whl" Sep 30 17:28:23 crc kubenswrapper[4778]: I0930 17:28:23.040632 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Sep 30 17:28:23 crc kubenswrapper[4778]: I0930 17:28:23.049566 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-ktwgv"] Sep 30 17:28:23 crc kubenswrapper[4778]: I0930 17:28:23.140266 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r75mf\" (UniqueName: \"kubernetes.io/projected/d0be6779-453e-42b4-b20c-85d16706300f-kube-api-access-r75mf\") pod \"nmstate-operator-5d6f6cfd66-ktwgv\" (UID: \"d0be6779-453e-42b4-b20c-85d16706300f\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-ktwgv" Sep 30 17:28:23 crc kubenswrapper[4778]: I0930 17:28:23.241842 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r75mf\" (UniqueName: \"kubernetes.io/projected/d0be6779-453e-42b4-b20c-85d16706300f-kube-api-access-r75mf\") pod \"nmstate-operator-5d6f6cfd66-ktwgv\" (UID: \"d0be6779-453e-42b4-b20c-85d16706300f\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-ktwgv" Sep 30 17:28:23 crc kubenswrapper[4778]: I0930 17:28:23.276149 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r75mf\" (UniqueName: \"kubernetes.io/projected/d0be6779-453e-42b4-b20c-85d16706300f-kube-api-access-r75mf\") pod \"nmstate-operator-5d6f6cfd66-ktwgv\" (UID: \"d0be6779-453e-42b4-b20c-85d16706300f\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-ktwgv" Sep 30 17:28:23 crc kubenswrapper[4778]: I0930 17:28:23.355004 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-ktwgv" Sep 30 17:28:23 crc kubenswrapper[4778]: I0930 17:28:23.613607 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-ktwgv"] Sep 30 17:28:24 crc kubenswrapper[4778]: I0930 17:28:24.016685 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-ktwgv" event={"ID":"d0be6779-453e-42b4-b20c-85d16706300f","Type":"ContainerStarted","Data":"92d10788bebcf8f87c9e1e03b56bd45672138a720d6b7a9979b844ab6c32501a"} Sep 30 17:28:27 crc kubenswrapper[4778]: I0930 17:28:27.040844 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-ktwgv" event={"ID":"d0be6779-453e-42b4-b20c-85d16706300f","Type":"ContainerStarted","Data":"7b3b1c62dcf09a2d9f61742f6ebbf0ed610282a822f77c279ee9a186bafc5e08"} Sep 30 17:28:27 crc kubenswrapper[4778]: I0930 17:28:27.071905 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-ktwgv" podStartSLOduration=1.239880243 podStartE2EDuration="4.07188567s" podCreationTimestamp="2025-09-30 17:28:23 +0000 UTC" firstStartedPulling="2025-09-30 17:28:23.625551109 +0000 UTC m=+642.615448912" lastFinishedPulling="2025-09-30 17:28:26.457556536 +0000 UTC m=+645.447454339" observedRunningTime="2025-09-30 17:28:27.070776026 +0000 UTC m=+646.060673869" watchObservedRunningTime="2025-09-30 17:28:27.07188567 +0000 UTC m=+646.061783473" Sep 30 17:28:27 crc kubenswrapper[4778]: I0930 17:28:27.993918 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-zzvns"] Sep 30 17:28:27 crc kubenswrapper[4778]: I0930 17:28:27.996680 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-zzvns" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.015277 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-tlb8j"] Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.016029 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/69e5d000-ba92-4853-9c50-db1681a3f87a-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-zzvns\" (UID: \"69e5d000-ba92-4853-9c50-db1681a3f87a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-zzvns" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.016184 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f78ck\" (UniqueName: \"kubernetes.io/projected/69e5d000-ba92-4853-9c50-db1681a3f87a-kube-api-access-f78ck\") pod \"nmstate-webhook-6d689559c5-zzvns\" (UID: \"69e5d000-ba92-4853-9c50-db1681a3f87a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-zzvns" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.016756 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.016821 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tlb8j" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.017161 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-82nwj" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.023870 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-zzvns"] Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.030822 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-hkkh6"] Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.031811 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-hkkh6" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.038965 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-tlb8j"] Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.117485 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5afb8964-3c8d-4da3-a57e-e8db0aae4b6d-nmstate-lock\") pod \"nmstate-handler-hkkh6\" (UID: \"5afb8964-3c8d-4da3-a57e-e8db0aae4b6d\") " pod="openshift-nmstate/nmstate-handler-hkkh6" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.117559 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zct28\" (UniqueName: \"kubernetes.io/projected/87413a2b-414e-464c-9806-ea029bd57019-kube-api-access-zct28\") pod \"nmstate-metrics-58fcddf996-tlb8j\" (UID: \"87413a2b-414e-464c-9806-ea029bd57019\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-tlb8j" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.117608 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5afb8964-3c8d-4da3-a57e-e8db0aae4b6d-ovs-socket\") pod \"nmstate-handler-hkkh6\" (UID: \"5afb8964-3c8d-4da3-a57e-e8db0aae4b6d\") " pod="openshift-nmstate/nmstate-handler-hkkh6" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.117689 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f78ck\" (UniqueName: \"kubernetes.io/projected/69e5d000-ba92-4853-9c50-db1681a3f87a-kube-api-access-f78ck\") pod \"nmstate-webhook-6d689559c5-zzvns\" (UID: \"69e5d000-ba92-4853-9c50-db1681a3f87a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-zzvns" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.117737 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5afb8964-3c8d-4da3-a57e-e8db0aae4b6d-dbus-socket\") pod \"nmstate-handler-hkkh6\" (UID: \"5afb8964-3c8d-4da3-a57e-e8db0aae4b6d\") " pod="openshift-nmstate/nmstate-handler-hkkh6" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.117790 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/69e5d000-ba92-4853-9c50-db1681a3f87a-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-zzvns\" (UID: \"69e5d000-ba92-4853-9c50-db1681a3f87a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-zzvns" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.117860 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-clxlx\" (UniqueName: \"kubernetes.io/projected/5afb8964-3c8d-4da3-a57e-e8db0aae4b6d-kube-api-access-clxlx\") pod \"nmstate-handler-hkkh6\" (UID: \"5afb8964-3c8d-4da3-a57e-e8db0aae4b6d\") " pod="openshift-nmstate/nmstate-handler-hkkh6" Sep 30 17:28:28 crc kubenswrapper[4778]: E0930 17:28:28.117972 4778 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Sep 30 17:28:28 crc kubenswrapper[4778]: E0930 17:28:28.118051 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69e5d000-ba92-4853-9c50-db1681a3f87a-tls-key-pair podName:69e5d000-ba92-4853-9c50-db1681a3f87a nodeName:}" failed. No retries permitted until 2025-09-30 17:28:28.618027245 +0000 UTC m=+647.607925048 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/69e5d000-ba92-4853-9c50-db1681a3f87a-tls-key-pair") pod "nmstate-webhook-6d689559c5-zzvns" (UID: "69e5d000-ba92-4853-9c50-db1681a3f87a") : secret "openshift-nmstate-webhook" not found Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.136800 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll"] Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.137629 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.162887 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.162881 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.179792 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-cjhts" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.185496 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll"] Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.187970 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f78ck\" (UniqueName: \"kubernetes.io/projected/69e5d000-ba92-4853-9c50-db1681a3f87a-kube-api-access-f78ck\") pod \"nmstate-webhook-6d689559c5-zzvns\" (UID: \"69e5d000-ba92-4853-9c50-db1681a3f87a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-zzvns" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.220454 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ac265e30-ad2e-426f-a657-35ca285dc557-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-hvhll\" (UID: \"ac265e30-ad2e-426f-a657-35ca285dc557\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.220544 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clxlx\" (UniqueName: \"kubernetes.io/projected/5afb8964-3c8d-4da3-a57e-e8db0aae4b6d-kube-api-access-clxlx\") pod \"nmstate-handler-hkkh6\" (UID: \"5afb8964-3c8d-4da3-a57e-e8db0aae4b6d\") " pod="openshift-nmstate/nmstate-handler-hkkh6" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.220588 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5afb8964-3c8d-4da3-a57e-e8db0aae4b6d-nmstate-lock\") pod \"nmstate-handler-hkkh6\" (UID: \"5afb8964-3c8d-4da3-a57e-e8db0aae4b6d\") " pod="openshift-nmstate/nmstate-handler-hkkh6" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.220649 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zct28\" (UniqueName: \"kubernetes.io/projected/87413a2b-414e-464c-9806-ea029bd57019-kube-api-access-zct28\") pod \"nmstate-metrics-58fcddf996-tlb8j\" (UID: \"87413a2b-414e-464c-9806-ea029bd57019\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-tlb8j" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.220674 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5afb8964-3c8d-4da3-a57e-e8db0aae4b6d-ovs-socket\") pod \"nmstate-handler-hkkh6\" (UID: \"5afb8964-3c8d-4da3-a57e-e8db0aae4b6d\") " pod="openshift-nmstate/nmstate-handler-hkkh6" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.220709 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac265e30-ad2e-426f-a657-35ca285dc557-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-hvhll\" (UID: \"ac265e30-ad2e-426f-a657-35ca285dc557\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.220729 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5afb8964-3c8d-4da3-a57e-e8db0aae4b6d-dbus-socket\") pod \"nmstate-handler-hkkh6\" (UID: \"5afb8964-3c8d-4da3-a57e-e8db0aae4b6d\") " pod="openshift-nmstate/nmstate-handler-hkkh6" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.220752 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56mkt\" (UniqueName: \"kubernetes.io/projected/ac265e30-ad2e-426f-a657-35ca285dc557-kube-api-access-56mkt\") pod \"nmstate-console-plugin-864bb6dfb5-hvhll\" (UID: \"ac265e30-ad2e-426f-a657-35ca285dc557\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.220772 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5afb8964-3c8d-4da3-a57e-e8db0aae4b6d-nmstate-lock\") pod \"nmstate-handler-hkkh6\" (UID: \"5afb8964-3c8d-4da3-a57e-e8db0aae4b6d\") " pod="openshift-nmstate/nmstate-handler-hkkh6" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.220896 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5afb8964-3c8d-4da3-a57e-e8db0aae4b6d-ovs-socket\") pod \"nmstate-handler-hkkh6\" (UID: \"5afb8964-3c8d-4da3-a57e-e8db0aae4b6d\") " pod="openshift-nmstate/nmstate-handler-hkkh6" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.221499 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5afb8964-3c8d-4da3-a57e-e8db0aae4b6d-dbus-socket\") pod \"nmstate-handler-hkkh6\" (UID: \"5afb8964-3c8d-4da3-a57e-e8db0aae4b6d\") " pod="openshift-nmstate/nmstate-handler-hkkh6" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.241485 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zct28\" (UniqueName: \"kubernetes.io/projected/87413a2b-414e-464c-9806-ea029bd57019-kube-api-access-zct28\") pod \"nmstate-metrics-58fcddf996-tlb8j\" (UID: \"87413a2b-414e-464c-9806-ea029bd57019\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-tlb8j" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.248509 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clxlx\" (UniqueName: \"kubernetes.io/projected/5afb8964-3c8d-4da3-a57e-e8db0aae4b6d-kube-api-access-clxlx\") pod \"nmstate-handler-hkkh6\" (UID: \"5afb8964-3c8d-4da3-a57e-e8db0aae4b6d\") " pod="openshift-nmstate/nmstate-handler-hkkh6" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.322193 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ac265e30-ad2e-426f-a657-35ca285dc557-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-hvhll\" (UID: \"ac265e30-ad2e-426f-a657-35ca285dc557\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.323283 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ac265e30-ad2e-426f-a657-35ca285dc557-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-hvhll\" (UID: \"ac265e30-ad2e-426f-a657-35ca285dc557\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.323538 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac265e30-ad2e-426f-a657-35ca285dc557-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-hvhll\" (UID: \"ac265e30-ad2e-426f-a657-35ca285dc557\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll" Sep 30 17:28:28 crc kubenswrapper[4778]: E0930 17:28:28.323650 4778 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Sep 30 17:28:28 crc kubenswrapper[4778]: E0930 17:28:28.323704 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac265e30-ad2e-426f-a657-35ca285dc557-plugin-serving-cert podName:ac265e30-ad2e-426f-a657-35ca285dc557 nodeName:}" failed. No retries permitted until 2025-09-30 17:28:28.823687304 +0000 UTC m=+647.813585107 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/ac265e30-ad2e-426f-a657-35ca285dc557-plugin-serving-cert") pod "nmstate-console-plugin-864bb6dfb5-hvhll" (UID: "ac265e30-ad2e-426f-a657-35ca285dc557") : secret "plugin-serving-cert" not found Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.324035 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56mkt\" (UniqueName: \"kubernetes.io/projected/ac265e30-ad2e-426f-a657-35ca285dc557-kube-api-access-56mkt\") pod \"nmstate-console-plugin-864bb6dfb5-hvhll\" (UID: \"ac265e30-ad2e-426f-a657-35ca285dc557\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.331574 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tlb8j" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.350104 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-hkkh6" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.355815 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56mkt\" (UniqueName: \"kubernetes.io/projected/ac265e30-ad2e-426f-a657-35ca285dc557-kube-api-access-56mkt\") pod \"nmstate-console-plugin-864bb6dfb5-hvhll\" (UID: \"ac265e30-ad2e-426f-a657-35ca285dc557\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll" Sep 30 17:28:28 crc kubenswrapper[4778]: W0930 17:28:28.424644 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5afb8964_3c8d_4da3_a57e_e8db0aae4b6d.slice/crio-a69a56f37c4ea8b56d13617fac81543ce87280436ce006620eb408fee3dcec3d WatchSource:0}: Error finding container a69a56f37c4ea8b56d13617fac81543ce87280436ce006620eb408fee3dcec3d: Status 404 returned error can't find the container with id a69a56f37c4ea8b56d13617fac81543ce87280436ce006620eb408fee3dcec3d Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.447204 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-74db9d569f-qj4tb"] Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.448555 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.457130 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74db9d569f-qj4tb"] Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.526456 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4baa6b2d-b13e-45e0-b662-ccaf086e8021-console-serving-cert\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.526524 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4baa6b2d-b13e-45e0-b662-ccaf086e8021-console-oauth-config\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.526549 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4baa6b2d-b13e-45e0-b662-ccaf086e8021-service-ca\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.526573 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4baa6b2d-b13e-45e0-b662-ccaf086e8021-trusted-ca-bundle\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.526597 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-4f8kv\" (UniqueName: \"kubernetes.io/projected/4baa6b2d-b13e-45e0-b662-ccaf086e8021-kube-api-access-4f8kv\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.526645 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4baa6b2d-b13e-45e0-b662-ccaf086e8021-oauth-serving-cert\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.526672 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4baa6b2d-b13e-45e0-b662-ccaf086e8021-console-config\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.603508 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-tlb8j"] Sep 30 17:28:28 crc kubenswrapper[4778]: W0930 17:28:28.611995 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87413a2b_414e_464c_9806_ea029bd57019.slice/crio-7b31b668b8eb7851b90fdd8f7050176be867779ad5811c3e869b1d4a7058e7c2 WatchSource:0}: Error finding container 7b31b668b8eb7851b90fdd8f7050176be867779ad5811c3e869b1d4a7058e7c2: Status 404 returned error can't find the container with id 7b31b668b8eb7851b90fdd8f7050176be867779ad5811c3e869b1d4a7058e7c2 Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.627550 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4baa6b2d-b13e-45e0-b662-ccaf086e8021-console-serving-cert\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.627631 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4baa6b2d-b13e-45e0-b662-ccaf086e8021-console-oauth-config\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.627654 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4baa6b2d-b13e-45e0-b662-ccaf086e8021-service-ca\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.627680 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4baa6b2d-b13e-45e0-b662-ccaf086e8021-trusted-ca-bundle\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.627704 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f8kv\" 
(UniqueName: \"kubernetes.io/projected/4baa6b2d-b13e-45e0-b662-ccaf086e8021-kube-api-access-4f8kv\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.627723 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4baa6b2d-b13e-45e0-b662-ccaf086e8021-oauth-serving-cert\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.627741 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/69e5d000-ba92-4853-9c50-db1681a3f87a-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-zzvns\" (UID: \"69e5d000-ba92-4853-9c50-db1681a3f87a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-zzvns" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.627763 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4baa6b2d-b13e-45e0-b662-ccaf086e8021-console-config\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.629375 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4baa6b2d-b13e-45e0-b662-ccaf086e8021-console-config\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.629375 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4baa6b2d-b13e-45e0-b662-ccaf086e8021-service-ca\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.629485 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4baa6b2d-b13e-45e0-b662-ccaf086e8021-oauth-serving-cert\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.630079 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4baa6b2d-b13e-45e0-b662-ccaf086e8021-trusted-ca-bundle\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.634063 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4baa6b2d-b13e-45e0-b662-ccaf086e8021-console-oauth-config\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.634106 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4baa6b2d-b13e-45e0-b662-ccaf086e8021-console-serving-cert\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.634144 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/69e5d000-ba92-4853-9c50-db1681a3f87a-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-zzvns\" (UID: \"69e5d000-ba92-4853-9c50-db1681a3f87a\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-zzvns" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.645400 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f8kv\" (UniqueName: \"kubernetes.io/projected/4baa6b2d-b13e-45e0-b662-ccaf086e8021-kube-api-access-4f8kv\") pod \"console-74db9d569f-qj4tb\" (UID: \"4baa6b2d-b13e-45e0-b662-ccaf086e8021\") " pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.766155 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.830901 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac265e30-ad2e-426f-a657-35ca285dc557-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-hvhll\" (UID: \"ac265e30-ad2e-426f-a657-35ca285dc557\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.836726 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/ac265e30-ad2e-426f-a657-35ca285dc557-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-hvhll\" (UID: \"ac265e30-ad2e-426f-a657-35ca285dc557\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.924800 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-zzvns" Sep 30 17:28:28 crc kubenswrapper[4778]: I0930 17:28:28.961838 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-74db9d569f-qj4tb"] Sep 30 17:28:28 crc kubenswrapper[4778]: W0930 17:28:28.966446 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4baa6b2d_b13e_45e0_b662_ccaf086e8021.slice/crio-981383fcbe40b5434250154ffdb22ded9ce8d3a77bec77a78d1f01445d78df88 WatchSource:0}: Error finding container 981383fcbe40b5434250154ffdb22ded9ce8d3a77bec77a78d1f01445d78df88: Status 404 returned error can't find the container with id 981383fcbe40b5434250154ffdb22ded9ce8d3a77bec77a78d1f01445d78df88 Sep 30 17:28:29 crc kubenswrapper[4778]: I0930 17:28:29.054747 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tlb8j" event={"ID":"87413a2b-414e-464c-9806-ea029bd57019","Type":"ContainerStarted","Data":"7b31b668b8eb7851b90fdd8f7050176be867779ad5811c3e869b1d4a7058e7c2"} Sep 30 17:28:29 crc kubenswrapper[4778]: I0930 17:28:29.055827 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74db9d569f-qj4tb" event={"ID":"4baa6b2d-b13e-45e0-b662-ccaf086e8021","Type":"ContainerStarted","Data":"981383fcbe40b5434250154ffdb22ded9ce8d3a77bec77a78d1f01445d78df88"} Sep 30 17:28:29 crc kubenswrapper[4778]: I0930 17:28:29.056987 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-hkkh6" event={"ID":"5afb8964-3c8d-4da3-a57e-e8db0aae4b6d","Type":"ContainerStarted","Data":"a69a56f37c4ea8b56d13617fac81543ce87280436ce006620eb408fee3dcec3d"} Sep 30 17:28:29 crc kubenswrapper[4778]: I0930 17:28:29.063584 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll" Sep 30 17:28:29 crc kubenswrapper[4778]: I0930 17:28:29.244665 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-zzvns"] Sep 30 17:28:29 crc kubenswrapper[4778]: I0930 17:28:29.356640 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll"] Sep 30 17:28:29 crc kubenswrapper[4778]: W0930 17:28:29.364791 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac265e30_ad2e_426f_a657_35ca285dc557.slice/crio-2ed1d92206e89cedf9f588e39bbf77a2337ea9ccabb3458e30dd23a5282adeb0 WatchSource:0}: Error finding container 2ed1d92206e89cedf9f588e39bbf77a2337ea9ccabb3458e30dd23a5282adeb0: Status 404 returned error can't find the container with id 2ed1d92206e89cedf9f588e39bbf77a2337ea9ccabb3458e30dd23a5282adeb0 Sep 30 17:28:30 crc kubenswrapper[4778]: I0930 17:28:30.068459 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll" event={"ID":"ac265e30-ad2e-426f-a657-35ca285dc557","Type":"ContainerStarted","Data":"2ed1d92206e89cedf9f588e39bbf77a2337ea9ccabb3458e30dd23a5282adeb0"} Sep 30 17:28:30 crc kubenswrapper[4778]: I0930 17:28:30.070043 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-zzvns" event={"ID":"69e5d000-ba92-4853-9c50-db1681a3f87a","Type":"ContainerStarted","Data":"d860e92ff511f6928f730f4078e63eef1b63b3fe3dd235809867aaead9f20885"} Sep 30 17:28:30 crc kubenswrapper[4778]: I0930 17:28:30.072736 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-74db9d569f-qj4tb" event={"ID":"4baa6b2d-b13e-45e0-b662-ccaf086e8021","Type":"ContainerStarted","Data":"d3d4dd78cbdc0e1d4aa6b3f12a7925b4022c9dc9880094212bfe242fe0c1c272"} Sep 30 17:28:30 crc kubenswrapper[4778]: I0930 17:28:30.105289 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-74db9d569f-qj4tb" podStartSLOduration=2.105266899 podStartE2EDuration="2.105266899s" podCreationTimestamp="2025-09-30 17:28:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:28:30.098453604 +0000 UTC m=+649.088351617" watchObservedRunningTime="2025-09-30 17:28:30.105266899 +0000 UTC m=+649.095164702" Sep 30 17:28:32 crc kubenswrapper[4778]: I0930 17:28:32.089708 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tlb8j" event={"ID":"87413a2b-414e-464c-9806-ea029bd57019","Type":"ContainerStarted","Data":"a0aa8cf71f9e0d40906e89624c57c785ec041decc54fe0d7277ae1771dabb211"} Sep 30 17:28:32 crc kubenswrapper[4778]: I0930 17:28:32.091739 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-zzvns" event={"ID":"69e5d000-ba92-4853-9c50-db1681a3f87a","Type":"ContainerStarted","Data":"4e369cfa2a175cff46cc541cc2fa5a1c86ec9d38b2876a8bebe91329589081da"} Sep 30 17:28:32 crc kubenswrapper[4778]: I0930 17:28:32.091977 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-zzvns" Sep 30 17:28:32 crc kubenswrapper[4778]: I0930 17:28:32.094434 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-hkkh6" 
event={"ID":"5afb8964-3c8d-4da3-a57e-e8db0aae4b6d","Type":"ContainerStarted","Data":"827b1512d832acdda9722c38b2d05e608e47e25ba6e108176d60c592af71d52c"} Sep 30 17:28:32 crc kubenswrapper[4778]: I0930 17:28:32.094628 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-hkkh6" Sep 30 17:28:32 crc kubenswrapper[4778]: I0930 17:28:32.114571 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-zzvns" podStartSLOduration=2.997629204 podStartE2EDuration="5.114534649s" podCreationTimestamp="2025-09-30 17:28:27 +0000 UTC" firstStartedPulling="2025-09-30 17:28:29.262026865 +0000 UTC m=+648.251924668" lastFinishedPulling="2025-09-30 17:28:31.3789323 +0000 UTC m=+650.368830113" observedRunningTime="2025-09-30 17:28:32.113117722 +0000 UTC m=+651.103015545" watchObservedRunningTime="2025-09-30 17:28:32.114534649 +0000 UTC m=+651.104432452" Sep 30 17:28:32 crc kubenswrapper[4778]: I0930 17:28:32.140274 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-hkkh6" podStartSLOduration=2.189436592 podStartE2EDuration="5.140241717s" podCreationTimestamp="2025-09-30 17:28:27 +0000 UTC" firstStartedPulling="2025-09-30 17:28:28.428260879 +0000 UTC m=+647.418158682" lastFinishedPulling="2025-09-30 17:28:31.379066014 +0000 UTC m=+650.368963807" observedRunningTime="2025-09-30 17:28:32.136811139 +0000 UTC m=+651.126708962" watchObservedRunningTime="2025-09-30 17:28:32.140241717 +0000 UTC m=+651.130139520" Sep 30 17:28:33 crc kubenswrapper[4778]: I0930 17:28:33.103030 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll" event={"ID":"ac265e30-ad2e-426f-a657-35ca285dc557","Type":"ContainerStarted","Data":"735d1ccb3ce48a1bdaa0f51d549db0412d0908429f12f7f42fd8ea9f52116f9b"} Sep 30 17:28:33 crc kubenswrapper[4778]: I0930 17:28:33.126634 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-hvhll" podStartSLOduration=2.028508015 podStartE2EDuration="5.126579844s" podCreationTimestamp="2025-09-30 17:28:28 +0000 UTC" firstStartedPulling="2025-09-30 17:28:29.367468004 +0000 UTC m=+648.357365807" lastFinishedPulling="2025-09-30 17:28:32.465539823 +0000 UTC m=+651.455437636" observedRunningTime="2025-09-30 17:28:33.125433414 +0000 UTC m=+652.115331227" watchObservedRunningTime="2025-09-30 17:28:33.126579844 +0000 UTC m=+652.116477647" Sep 30 17:28:35 crc kubenswrapper[4778]: I0930 17:28:35.121179 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tlb8j" event={"ID":"87413a2b-414e-464c-9806-ea029bd57019","Type":"ContainerStarted","Data":"fd084c35d6455929bddcb1dbb81bfe34eff9b80ef5376ec51a17ec24c71c7260"} Sep 30 17:28:35 crc kubenswrapper[4778]: I0930 17:28:35.154300 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tlb8j" podStartSLOduration=2.749975622 podStartE2EDuration="8.154254065s" podCreationTimestamp="2025-09-30 17:28:27 +0000 UTC" firstStartedPulling="2025-09-30 17:28:28.615324429 +0000 UTC m=+647.605222232" lastFinishedPulling="2025-09-30 17:28:34.019602872 +0000 UTC m=+653.009500675" observedRunningTime="2025-09-30 17:28:35.146990819 +0000 UTC m=+654.136888662" watchObservedRunningTime="2025-09-30 17:28:35.154254065 +0000 UTC m=+654.144151908" Sep 30 17:28:38 crc 
kubenswrapper[4778]: I0930 17:28:38.388879 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-hkkh6" Sep 30 17:28:38 crc kubenswrapper[4778]: I0930 17:28:38.767121 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:38 crc kubenswrapper[4778]: I0930 17:28:38.767236 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:38 crc kubenswrapper[4778]: I0930 17:28:38.780278 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:39 crc kubenswrapper[4778]: I0930 17:28:39.160593 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-74db9d569f-qj4tb" Sep 30 17:28:39 crc kubenswrapper[4778]: I0930 17:28:39.228403 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-pl4fp"] Sep 30 17:28:48 crc kubenswrapper[4778]: I0930 17:28:48.933802 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-zzvns" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.253137 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr"] Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.255648 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.257716 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.267869 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr"] Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.312386 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-pl4fp" podUID="82aaff94-0c3c-4a1b-be0c-5371c3b60ab0" containerName="console" containerID="cri-o://0d5308370a9f62c1a4361ed86a2e0234faba68561f4ed818c88df78bac7a5920" gracePeriod=15 Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.409409 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dfd6413-a4b2-4598-bd14-4e9b332e8221-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr\" (UID: \"0dfd6413-a4b2-4598-bd14-4e9b332e8221\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.409524 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dfd6413-a4b2-4598-bd14-4e9b332e8221-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr\" (UID: \"0dfd6413-a4b2-4598-bd14-4e9b332e8221\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.409577 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-j7wn2\" (UniqueName: \"kubernetes.io/projected/0dfd6413-a4b2-4598-bd14-4e9b332e8221-kube-api-access-j7wn2\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr\" (UID: \"0dfd6413-a4b2-4598-bd14-4e9b332e8221\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.511433 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dfd6413-a4b2-4598-bd14-4e9b332e8221-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr\" (UID: \"0dfd6413-a4b2-4598-bd14-4e9b332e8221\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.511752 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7wn2\" (UniqueName: \"kubernetes.io/projected/0dfd6413-a4b2-4598-bd14-4e9b332e8221-kube-api-access-j7wn2\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr\" (UID: \"0dfd6413-a4b2-4598-bd14-4e9b332e8221\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.511811 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dfd6413-a4b2-4598-bd14-4e9b332e8221-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr\" (UID: \"0dfd6413-a4b2-4598-bd14-4e9b332e8221\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.512502 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dfd6413-a4b2-4598-bd14-4e9b332e8221-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr\" (UID: \"0dfd6413-a4b2-4598-bd14-4e9b332e8221\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.513030 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dfd6413-a4b2-4598-bd14-4e9b332e8221-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr\" (UID: \"0dfd6413-a4b2-4598-bd14-4e9b332e8221\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.560979 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7wn2\" (UniqueName: \"kubernetes.io/projected/0dfd6413-a4b2-4598-bd14-4e9b332e8221-kube-api-access-j7wn2\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr\" (UID: \"0dfd6413-a4b2-4598-bd14-4e9b332e8221\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.640151 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.739735 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-pl4fp_82aaff94-0c3c-4a1b-be0c-5371c3b60ab0/console/0.log" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.739891 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pl4fp" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.895032 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr"] Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.918297 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-config\") pod \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.918367 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-serving-cert\") pod \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.918400 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-trusted-ca-bundle\") pod \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.918436 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jgczg\" (UniqueName: \"kubernetes.io/projected/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-kube-api-access-jgczg\") pod \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.918573 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-service-ca\") pod \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.918635 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-oauth-config\") pod \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.918659 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-oauth-serving-cert\") pod \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\" (UID: \"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0\") " Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.919304 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod 
"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0" (UID: "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.919348 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-config" (OuterVolumeSpecName: "console-config") pod "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0" (UID: "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.919424 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0" (UID: "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.920026 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-service-ca" (OuterVolumeSpecName: "service-ca") pod "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0" (UID: "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.925551 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0" (UID: "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.926117 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0" (UID: "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:29:04 crc kubenswrapper[4778]: I0930 17:29:04.926140 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-kube-api-access-jgczg" (OuterVolumeSpecName: "kube-api-access-jgczg") pod "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0" (UID: "82aaff94-0c3c-4a1b-be0c-5371c3b60ab0"). InnerVolumeSpecName "kube-api-access-jgczg". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.021003 4778 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-service-ca\") on node \"crc\" DevicePath \"\""
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.021079 4778 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-oauth-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.021384 4778 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.021438 4778 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.021453 4778 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-console-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.021471 4778 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.021484 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jgczg\" (UniqueName: \"kubernetes.io/projected/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0-kube-api-access-jgczg\") on node \"crc\" DevicePath \"\""
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.357355 4778 generic.go:334] "Generic (PLEG): container finished" podID="0dfd6413-a4b2-4598-bd14-4e9b332e8221" containerID="7ece24423ca38df5678a1bc3d8edd98a853ce3949e12a16b267819a3dc1ab4f5" exitCode=0
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.357437 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr" event={"ID":"0dfd6413-a4b2-4598-bd14-4e9b332e8221","Type":"ContainerDied","Data":"7ece24423ca38df5678a1bc3d8edd98a853ce3949e12a16b267819a3dc1ab4f5"}
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.357467 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr" event={"ID":"0dfd6413-a4b2-4598-bd14-4e9b332e8221","Type":"ContainerStarted","Data":"80b7ce137f032c9727b4e9b7333a463ecc5ae892285037a4448aac2bebd1ed4c"}
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.359604 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-pl4fp_82aaff94-0c3c-4a1b-be0c-5371c3b60ab0/console/0.log"
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.359703 4778 generic.go:334] "Generic (PLEG): container finished" podID="82aaff94-0c3c-4a1b-be0c-5371c3b60ab0" containerID="0d5308370a9f62c1a4361ed86a2e0234faba68561f4ed818c88df78bac7a5920" exitCode=2
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.359759 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pl4fp" event={"ID":"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0","Type":"ContainerDied","Data":"0d5308370a9f62c1a4361ed86a2e0234faba68561f4ed818c88df78bac7a5920"}
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.359809 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-pl4fp" event={"ID":"82aaff94-0c3c-4a1b-be0c-5371c3b60ab0","Type":"ContainerDied","Data":"cab6357fe53574e5e2b96c4711518cd3e474b937d37cb4acd5959356abcc1c2f"}
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.359847 4778 scope.go:117] "RemoveContainer" containerID="0d5308370a9f62c1a4361ed86a2e0234faba68561f4ed818c88df78bac7a5920"
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.360050 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-pl4fp"
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.389881 4778 scope.go:117] "RemoveContainer" containerID="0d5308370a9f62c1a4361ed86a2e0234faba68561f4ed818c88df78bac7a5920"
Sep 30 17:29:05 crc kubenswrapper[4778]: E0930 17:29:05.391745 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d5308370a9f62c1a4361ed86a2e0234faba68561f4ed818c88df78bac7a5920\": container with ID starting with 0d5308370a9f62c1a4361ed86a2e0234faba68561f4ed818c88df78bac7a5920 not found: ID does not exist" containerID="0d5308370a9f62c1a4361ed86a2e0234faba68561f4ed818c88df78bac7a5920"
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.391790 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d5308370a9f62c1a4361ed86a2e0234faba68561f4ed818c88df78bac7a5920"} err="failed to get container status \"0d5308370a9f62c1a4361ed86a2e0234faba68561f4ed818c88df78bac7a5920\": rpc error: code = NotFound desc = could not find container \"0d5308370a9f62c1a4361ed86a2e0234faba68561f4ed818c88df78bac7a5920\": container with ID starting with 0d5308370a9f62c1a4361ed86a2e0234faba68561f4ed818c88df78bac7a5920 not found: ID does not exist"
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.402721 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-pl4fp"]
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.406002 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-pl4fp"]
Sep 30 17:29:05 crc kubenswrapper[4778]: I0930 17:29:05.726982 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82aaff94-0c3c-4a1b-be0c-5371c3b60ab0" path="/var/lib/kubelet/pods/82aaff94-0c3c-4a1b-be0c-5371c3b60ab0/volumes"
Sep 30 17:29:07 crc kubenswrapper[4778]: I0930 17:29:07.377913 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr" event={"ID":"0dfd6413-a4b2-4598-bd14-4e9b332e8221","Type":"ContainerStarted","Data":"716b9fe6389d4cce9be45f51ab26781e2c6b8706f793eb8073bb7bd5a9a9289d"}
Sep 30 17:29:08 crc kubenswrapper[4778]: I0930 17:29:08.387382 4778 generic.go:334] "Generic (PLEG): container finished" podID="0dfd6413-a4b2-4598-bd14-4e9b332e8221" containerID="716b9fe6389d4cce9be45f51ab26781e2c6b8706f793eb8073bb7bd5a9a9289d" exitCode=0
Sep 30 17:29:08 crc kubenswrapper[4778]: I0930 17:29:08.387775 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr" event={"ID":"0dfd6413-a4b2-4598-bd14-4e9b332e8221","Type":"ContainerDied","Data":"716b9fe6389d4cce9be45f51ab26781e2c6b8706f793eb8073bb7bd5a9a9289d"}
Sep 30 17:29:09 crc kubenswrapper[4778]: I0930 17:29:09.398183 4778 generic.go:334] "Generic (PLEG): container finished" podID="0dfd6413-a4b2-4598-bd14-4e9b332e8221" containerID="a4e8cff784ae1623fc831a94b8a7fe64757b52ddfe70c39efb418fd2b7bc3573" exitCode=0
Sep 30 17:29:09 crc kubenswrapper[4778]: I0930 17:29:09.398297 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr" event={"ID":"0dfd6413-a4b2-4598-bd14-4e9b332e8221","Type":"ContainerDied","Data":"a4e8cff784ae1623fc831a94b8a7fe64757b52ddfe70c39efb418fd2b7bc3573"}
Sep 30 17:29:10 crc kubenswrapper[4778]: I0930 17:29:10.718544 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr"
Sep 30 17:29:10 crc kubenswrapper[4778]: I0930 17:29:10.813575 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7wn2\" (UniqueName: \"kubernetes.io/projected/0dfd6413-a4b2-4598-bd14-4e9b332e8221-kube-api-access-j7wn2\") pod \"0dfd6413-a4b2-4598-bd14-4e9b332e8221\" (UID: \"0dfd6413-a4b2-4598-bd14-4e9b332e8221\") "
Sep 30 17:29:10 crc kubenswrapper[4778]: I0930 17:29:10.813818 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dfd6413-a4b2-4598-bd14-4e9b332e8221-bundle\") pod \"0dfd6413-a4b2-4598-bd14-4e9b332e8221\" (UID: \"0dfd6413-a4b2-4598-bd14-4e9b332e8221\") "
Sep 30 17:29:10 crc kubenswrapper[4778]: I0930 17:29:10.813912 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dfd6413-a4b2-4598-bd14-4e9b332e8221-util\") pod \"0dfd6413-a4b2-4598-bd14-4e9b332e8221\" (UID: \"0dfd6413-a4b2-4598-bd14-4e9b332e8221\") "
Sep 30 17:29:10 crc kubenswrapper[4778]: I0930 17:29:10.815595 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dfd6413-a4b2-4598-bd14-4e9b332e8221-bundle" (OuterVolumeSpecName: "bundle") pod "0dfd6413-a4b2-4598-bd14-4e9b332e8221" (UID: "0dfd6413-a4b2-4598-bd14-4e9b332e8221"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:29:10 crc kubenswrapper[4778]: I0930 17:29:10.823486 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dfd6413-a4b2-4598-bd14-4e9b332e8221-kube-api-access-j7wn2" (OuterVolumeSpecName: "kube-api-access-j7wn2") pod "0dfd6413-a4b2-4598-bd14-4e9b332e8221" (UID: "0dfd6413-a4b2-4598-bd14-4e9b332e8221"). InnerVolumeSpecName "kube-api-access-j7wn2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:29:10 crc kubenswrapper[4778]: I0930 17:29:10.824998 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dfd6413-a4b2-4598-bd14-4e9b332e8221-util" (OuterVolumeSpecName: "util") pod "0dfd6413-a4b2-4598-bd14-4e9b332e8221" (UID: "0dfd6413-a4b2-4598-bd14-4e9b332e8221"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:29:10 crc kubenswrapper[4778]: I0930 17:29:10.915368 4778 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0dfd6413-a4b2-4598-bd14-4e9b332e8221-util\") on node \"crc\" DevicePath \"\""
Sep 30 17:29:10 crc kubenswrapper[4778]: I0930 17:29:10.915433 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7wn2\" (UniqueName: \"kubernetes.io/projected/0dfd6413-a4b2-4598-bd14-4e9b332e8221-kube-api-access-j7wn2\") on node \"crc\" DevicePath \"\""
Sep 30 17:29:10 crc kubenswrapper[4778]: I0930 17:29:10.915454 4778 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0dfd6413-a4b2-4598-bd14-4e9b332e8221-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:29:11 crc kubenswrapper[4778]: I0930 17:29:11.416746 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr" event={"ID":"0dfd6413-a4b2-4598-bd14-4e9b332e8221","Type":"ContainerDied","Data":"80b7ce137f032c9727b4e9b7333a463ecc5ae892285037a4448aac2bebd1ed4c"}
Sep 30 17:29:11 crc kubenswrapper[4778]: I0930 17:29:11.416832 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80b7ce137f032c9727b4e9b7333a463ecc5ae892285037a4448aac2bebd1ed4c"
Sep 30 17:29:11 crc kubenswrapper[4778]: I0930 17:29:11.416871 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.182266 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw"]
Sep 30 17:29:19 crc kubenswrapper[4778]: E0930 17:29:19.183237 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dfd6413-a4b2-4598-bd14-4e9b332e8221" containerName="pull"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.183250 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dfd6413-a4b2-4598-bd14-4e9b332e8221" containerName="pull"
Sep 30 17:29:19 crc kubenswrapper[4778]: E0930 17:29:19.183259 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dfd6413-a4b2-4598-bd14-4e9b332e8221" containerName="extract"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.183265 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dfd6413-a4b2-4598-bd14-4e9b332e8221" containerName="extract"
Sep 30 17:29:19 crc kubenswrapper[4778]: E0930 17:29:19.183277 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dfd6413-a4b2-4598-bd14-4e9b332e8221" containerName="util"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.183285 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dfd6413-a4b2-4598-bd14-4e9b332e8221" containerName="util"
Sep 30 17:29:19 crc kubenswrapper[4778]: E0930 17:29:19.183295 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82aaff94-0c3c-4a1b-be0c-5371c3b60ab0" containerName="console"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.183301 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="82aaff94-0c3c-4a1b-be0c-5371c3b60ab0" containerName="console"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.183399 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="82aaff94-0c3c-4a1b-be0c-5371c3b60ab0" containerName="console"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.183419 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dfd6413-a4b2-4598-bd14-4e9b332e8221" containerName="extract"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.183935 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.187413 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.188059 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.188119 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.188668 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.188782 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-c87bw"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.211258 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw"]
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.249060 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e6d40185-14db-4566-b50d-1c05eef2b841-webhook-cert\") pod \"metallb-operator-controller-manager-5759cd8585-r49vw\" (UID: \"e6d40185-14db-4566-b50d-1c05eef2b841\") " pod="metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.249166 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6d40185-14db-4566-b50d-1c05eef2b841-apiservice-cert\") pod \"metallb-operator-controller-manager-5759cd8585-r49vw\" (UID: \"e6d40185-14db-4566-b50d-1c05eef2b841\") " pod="metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.249205 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkjj9\" (UniqueName: \"kubernetes.io/projected/e6d40185-14db-4566-b50d-1c05eef2b841-kube-api-access-fkjj9\") pod \"metallb-operator-controller-manager-5759cd8585-r49vw\" (UID: \"e6d40185-14db-4566-b50d-1c05eef2b841\") " pod="metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.350041 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6d40185-14db-4566-b50d-1c05eef2b841-apiservice-cert\") pod \"metallb-operator-controller-manager-5759cd8585-r49vw\" (UID: \"e6d40185-14db-4566-b50d-1c05eef2b841\") " pod="metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.350412 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkjj9\" (UniqueName: \"kubernetes.io/projected/e6d40185-14db-4566-b50d-1c05eef2b841-kube-api-access-fkjj9\") pod \"metallb-operator-controller-manager-5759cd8585-r49vw\" (UID: \"e6d40185-14db-4566-b50d-1c05eef2b841\") " pod="metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.350564 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e6d40185-14db-4566-b50d-1c05eef2b841-webhook-cert\") pod \"metallb-operator-controller-manager-5759cd8585-r49vw\" (UID: \"e6d40185-14db-4566-b50d-1c05eef2b841\") " pod="metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.358465 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e6d40185-14db-4566-b50d-1c05eef2b841-apiservice-cert\") pod \"metallb-operator-controller-manager-5759cd8585-r49vw\" (UID: \"e6d40185-14db-4566-b50d-1c05eef2b841\") " pod="metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.370329 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e6d40185-14db-4566-b50d-1c05eef2b841-webhook-cert\") pod \"metallb-operator-controller-manager-5759cd8585-r49vw\" (UID: \"e6d40185-14db-4566-b50d-1c05eef2b841\") " pod="metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.372231 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkjj9\" (UniqueName: \"kubernetes.io/projected/e6d40185-14db-4566-b50d-1c05eef2b841-kube-api-access-fkjj9\") pod \"metallb-operator-controller-manager-5759cd8585-r49vw\" (UID: \"e6d40185-14db-4566-b50d-1c05eef2b841\") " pod="metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.506579 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.605025 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt"]
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.606031 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.610355 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.615903 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.616060 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-xckks"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.630289 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt"]
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.657492 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/560fc21e-88c9-48b3-8077-5ee5d1056690-webhook-cert\") pod \"metallb-operator-webhook-server-65cdcfb5d5-65ppt\" (UID: \"560fc21e-88c9-48b3-8077-5ee5d1056690\") " pod="metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.657637 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/560fc21e-88c9-48b3-8077-5ee5d1056690-apiservice-cert\") pod \"metallb-operator-webhook-server-65cdcfb5d5-65ppt\" (UID: \"560fc21e-88c9-48b3-8077-5ee5d1056690\") " pod="metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.657686 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx8cw\" (UniqueName: \"kubernetes.io/projected/560fc21e-88c9-48b3-8077-5ee5d1056690-kube-api-access-dx8cw\") pod \"metallb-operator-webhook-server-65cdcfb5d5-65ppt\" (UID: \"560fc21e-88c9-48b3-8077-5ee5d1056690\") " pod="metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.758720 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/560fc21e-88c9-48b3-8077-5ee5d1056690-apiservice-cert\") pod \"metallb-operator-webhook-server-65cdcfb5d5-65ppt\" (UID: \"560fc21e-88c9-48b3-8077-5ee5d1056690\") " pod="metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.759144 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx8cw\" (UniqueName: \"kubernetes.io/projected/560fc21e-88c9-48b3-8077-5ee5d1056690-kube-api-access-dx8cw\") pod \"metallb-operator-webhook-server-65cdcfb5d5-65ppt\" (UID: \"560fc21e-88c9-48b3-8077-5ee5d1056690\") " pod="metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.759221 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/560fc21e-88c9-48b3-8077-5ee5d1056690-webhook-cert\") pod \"metallb-operator-webhook-server-65cdcfb5d5-65ppt\" (UID: \"560fc21e-88c9-48b3-8077-5ee5d1056690\") " pod="metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.776553 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/560fc21e-88c9-48b3-8077-5ee5d1056690-apiservice-cert\") pod \"metallb-operator-webhook-server-65cdcfb5d5-65ppt\" (UID: \"560fc21e-88c9-48b3-8077-5ee5d1056690\") " pod="metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.777172 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/560fc21e-88c9-48b3-8077-5ee5d1056690-webhook-cert\") pod \"metallb-operator-webhook-server-65cdcfb5d5-65ppt\" (UID: \"560fc21e-88c9-48b3-8077-5ee5d1056690\") " pod="metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.784835 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx8cw\" (UniqueName: \"kubernetes.io/projected/560fc21e-88c9-48b3-8077-5ee5d1056690-kube-api-access-dx8cw\") pod \"metallb-operator-webhook-server-65cdcfb5d5-65ppt\" (UID: \"560fc21e-88c9-48b3-8077-5ee5d1056690\") " pod="metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt"
Sep 30 17:29:19 crc kubenswrapper[4778]: I0930 17:29:19.920854 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt"
Sep 30 17:29:20 crc kubenswrapper[4778]: I0930 17:29:20.104249 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw"]
Sep 30 17:29:20 crc kubenswrapper[4778]: I0930 17:29:20.142524 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt"]
Sep 30 17:29:20 crc kubenswrapper[4778]: W0930 17:29:20.153995 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod560fc21e_88c9_48b3_8077_5ee5d1056690.slice/crio-0c2bc9c47f1f1983375892feb387a0c3bed05d6a47ac3ab97fa9cc3159276a3c WatchSource:0}: Error finding container 0c2bc9c47f1f1983375892feb387a0c3bed05d6a47ac3ab97fa9cc3159276a3c: Status 404 returned error can't find the container with id 0c2bc9c47f1f1983375892feb387a0c3bed05d6a47ac3ab97fa9cc3159276a3c
Sep 30 17:29:20 crc kubenswrapper[4778]: I0930 17:29:20.469718 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt" event={"ID":"560fc21e-88c9-48b3-8077-5ee5d1056690","Type":"ContainerStarted","Data":"0c2bc9c47f1f1983375892feb387a0c3bed05d6a47ac3ab97fa9cc3159276a3c"}
Sep 30 17:29:20 crc kubenswrapper[4778]: I0930 17:29:20.471579 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw" event={"ID":"e6d40185-14db-4566-b50d-1c05eef2b841","Type":"ContainerStarted","Data":"537d5b347e607b9c8469f7e9c084c0d2cc7aa3c41788778c30378273b5a4ed95"}
Sep 30 17:29:25 crc kubenswrapper[4778]: I0930 17:29:25.508586 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw" event={"ID":"e6d40185-14db-4566-b50d-1c05eef2b841","Type":"ContainerStarted","Data":"6b37b6aa6a4a96691186e4ff80a9ff594ebd47b06d48e0ac86e7596293855c6b"}
Sep 30 17:29:25 crc kubenswrapper[4778]: I0930 17:29:25.510468 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt" event={"ID":"560fc21e-88c9-48b3-8077-5ee5d1056690","Type":"ContainerStarted","Data":"54c083c70496e5c6a19182a2fb11dd235174c79e12fb33a095c56c28f9b462b6"}
Sep 30 17:29:25 crc kubenswrapper[4778]: I0930 17:29:25.510626 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt"
Sep 30 17:29:25 crc kubenswrapper[4778]: I0930 17:29:25.534512 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw" podStartSLOduration=1.7990544800000001 podStartE2EDuration="6.534484635s" podCreationTimestamp="2025-09-30 17:29:19 +0000 UTC" firstStartedPulling="2025-09-30 17:29:20.121384542 +0000 UTC m=+699.111282345" lastFinishedPulling="2025-09-30 17:29:24.856814697 +0000 UTC m=+703.846712500" observedRunningTime="2025-09-30 17:29:25.531655254 +0000 UTC m=+704.521553067" watchObservedRunningTime="2025-09-30 17:29:25.534484635 +0000 UTC m=+704.524382448"
Sep 30 17:29:25 crc kubenswrapper[4778]: I0930 17:29:25.555391 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt" podStartSLOduration=1.822169041 podStartE2EDuration="6.555364677s" podCreationTimestamp="2025-09-30 17:29:19 +0000 UTC" firstStartedPulling="2025-09-30 17:29:20.160658057 +0000 UTC m=+699.150555860" lastFinishedPulling="2025-09-30 17:29:24.893853693 +0000 UTC m=+703.883751496" observedRunningTime="2025-09-30 17:29:25.551578578 +0000 UTC m=+704.541476401" watchObservedRunningTime="2025-09-30 17:29:25.555364677 +0000 UTC m=+704.545262480"
Sep 30 17:29:26 crc kubenswrapper[4778]: I0930 17:29:26.517511 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw"
Sep 30 17:29:39 crc kubenswrapper[4778]: I0930 17:29:39.929971 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-65cdcfb5d5-65ppt"
Sep 30 17:29:59 crc kubenswrapper[4778]: I0930 17:29:59.509890 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5759cd8585-r49vw"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.158340 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj"]
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.159848 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.163122 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.168654 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj"]
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.170920 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.264771 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62ee697e-477a-4ac8-bfc8-7cd17364369d-config-volume\") pod \"collect-profiles-29320890-fwlcj\" (UID: \"62ee697e-477a-4ac8-bfc8-7cd17364369d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.265010 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2psd\" (UniqueName: \"kubernetes.io/projected/62ee697e-477a-4ac8-bfc8-7cd17364369d-kube-api-access-f2psd\") pod \"collect-profiles-29320890-fwlcj\" (UID: \"62ee697e-477a-4ac8-bfc8-7cd17364369d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.265081 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62ee697e-477a-4ac8-bfc8-7cd17364369d-secret-volume\") pod \"collect-profiles-29320890-fwlcj\" (UID: \"62ee697e-477a-4ac8-bfc8-7cd17364369d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.315794 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-26pmr"]
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.318466 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.325461 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-p5b64"]
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.326757 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p5b64"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.332511 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.333229 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.335701 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-tz69l"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.336378 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.367981 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62ee697e-477a-4ac8-bfc8-7cd17364369d-config-volume\") pod \"collect-profiles-29320890-fwlcj\" (UID: \"62ee697e-477a-4ac8-bfc8-7cd17364369d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.368051 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96xrr\" (UniqueName: \"kubernetes.io/projected/c2bdec16-ac5d-4456-9862-220cd1ee1d40-kube-api-access-96xrr\") pod \"frr-k8s-webhook-server-5478bdb765-p5b64\" (UID: \"c2bdec16-ac5d-4456-9862-220cd1ee1d40\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p5b64"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.368082 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/841759dc-7794-4849-8d7a-a16b1674d011-reloader\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.368115 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2bdec16-ac5d-4456-9862-220cd1ee1d40-cert\") pod \"frr-k8s-webhook-server-5478bdb765-p5b64\" (UID: \"c2bdec16-ac5d-4456-9862-220cd1ee1d40\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p5b64"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.368150 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/841759dc-7794-4849-8d7a-a16b1674d011-frr-startup\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.368177 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/841759dc-7794-4849-8d7a-a16b1674d011-frr-sockets\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.368204 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2psd\" (UniqueName: \"kubernetes.io/projected/62ee697e-477a-4ac8-bfc8-7cd17364369d-kube-api-access-f2psd\") pod \"collect-profiles-29320890-fwlcj\" (UID: \"62ee697e-477a-4ac8-bfc8-7cd17364369d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.368229 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62ee697e-477a-4ac8-bfc8-7cd17364369d-secret-volume\") pod \"collect-profiles-29320890-fwlcj\" (UID: \"62ee697e-477a-4ac8-bfc8-7cd17364369d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.368262 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/841759dc-7794-4849-8d7a-a16b1674d011-metrics\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.368286 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/841759dc-7794-4849-8d7a-a16b1674d011-metrics-certs\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.368321 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/841759dc-7794-4849-8d7a-a16b1674d011-frr-conf\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.368350 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgxts\" (UniqueName: \"kubernetes.io/projected/841759dc-7794-4849-8d7a-a16b1674d011-kube-api-access-bgxts\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.369528 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62ee697e-477a-4ac8-bfc8-7cd17364369d-config-volume\") pod \"collect-profiles-29320890-fwlcj\" (UID: \"62ee697e-477a-4ac8-bfc8-7cd17364369d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.369909 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-p5b64"]
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.397432 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62ee697e-477a-4ac8-bfc8-7cd17364369d-secret-volume\") pod \"collect-profiles-29320890-fwlcj\" (UID: \"62ee697e-477a-4ac8-bfc8-7cd17364369d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.414733 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2psd\" (UniqueName: \"kubernetes.io/projected/62ee697e-477a-4ac8-bfc8-7cd17364369d-kube-api-access-f2psd\") pod \"collect-profiles-29320890-fwlcj\" (UID: \"62ee697e-477a-4ac8-bfc8-7cd17364369d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.426735 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-8h2sm"]
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.427769 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8h2sm"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.431551 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.432046 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.432190 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-t2bgb"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.435838 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.454989 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-k845h"]
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.456006 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-k845h"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.460147 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.464533 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-k845h"]
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.470437 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/841759dc-7794-4849-8d7a-a16b1674d011-frr-conf\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.470490 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgxts\" (UniqueName: \"kubernetes.io/projected/841759dc-7794-4849-8d7a-a16b1674d011-kube-api-access-bgxts\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.470523 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ac78c57-42f5-4d55-98a0-74b1c3b4beb3-metrics-certs\") pod \"controller-5d688f5ffc-k845h\" (UID: \"6ac78c57-42f5-4d55-98a0-74b1c3b4beb3\") " pod="metallb-system/controller-5d688f5ffc-k845h"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.470542 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96xrr\" (UniqueName: \"kubernetes.io/projected/c2bdec16-ac5d-4456-9862-220cd1ee1d40-kube-api-access-96xrr\") pod \"frr-k8s-webhook-server-5478bdb765-p5b64\" (UID: \"c2bdec16-ac5d-4456-9862-220cd1ee1d40\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p5b64"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.470561 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/841759dc-7794-4849-8d7a-a16b1674d011-reloader\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.470578 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ac78c57-42f5-4d55-98a0-74b1c3b4beb3-cert\") pod \"controller-5d688f5ffc-k845h\" (UID: \"6ac78c57-42f5-4d55-98a0-74b1c3b4beb3\") " pod="metallb-system/controller-5d688f5ffc-k845h"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.470599 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2bdec16-ac5d-4456-9862-220cd1ee1d40-cert\") pod \"frr-k8s-webhook-server-5478bdb765-p5b64\" (UID: \"c2bdec16-ac5d-4456-9862-220cd1ee1d40\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p5b64"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.470632 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c9e84483-49a9-42d3-b721-929a58feda93-memberlist\") pod \"speaker-8h2sm\" (UID: \"c9e84483-49a9-42d3-b721-929a58feda93\") " pod="metallb-system/speaker-8h2sm"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.470662 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/841759dc-7794-4849-8d7a-a16b1674d011-frr-startup\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.470680 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/841759dc-7794-4849-8d7a-a16b1674d011-frr-sockets\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.470703 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/841759dc-7794-4849-8d7a-a16b1674d011-metrics\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.470722 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lshm\" (UniqueName: \"kubernetes.io/projected/c9e84483-49a9-42d3-b721-929a58feda93-kube-api-access-9lshm\") pod \"speaker-8h2sm\" (UID: \"c9e84483-49a9-42d3-b721-929a58feda93\") " pod="metallb-system/speaker-8h2sm"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.470742 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/841759dc-7794-4849-8d7a-a16b1674d011-metrics-certs\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.470759 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c9e84483-49a9-42d3-b721-929a58feda93-metallb-excludel2\") pod \"speaker-8h2sm\" (UID: \"c9e84483-49a9-42d3-b721-929a58feda93\") " pod="metallb-system/speaker-8h2sm"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.470775 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9e84483-49a9-42d3-b721-929a58feda93-metrics-certs\") pod \"speaker-8h2sm\" (UID: \"c9e84483-49a9-42d3-b721-929a58feda93\") " pod="metallb-system/speaker-8h2sm"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.470796 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpqg2\" (UniqueName: \"kubernetes.io/projected/6ac78c57-42f5-4d55-98a0-74b1c3b4beb3-kube-api-access-kpqg2\") pod \"controller-5d688f5ffc-k845h\" (UID: \"6ac78c57-42f5-4d55-98a0-74b1c3b4beb3\") " pod="metallb-system/controller-5d688f5ffc-k845h"
Sep 30 17:30:00 crc kubenswrapper[4778]: E0930 17:30:00.471366 4778 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Sep 30 17:30:00 crc kubenswrapper[4778]: E0930 17:30:00.471442 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2bdec16-ac5d-4456-9862-220cd1ee1d40-cert podName:c2bdec16-ac5d-4456-9862-220cd1ee1d40 nodeName:}" failed. No retries permitted until 2025-09-30 17:30:00.971407218 +0000 UTC m=+739.961305021 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c2bdec16-ac5d-4456-9862-220cd1ee1d40-cert") pod "frr-k8s-webhook-server-5478bdb765-p5b64" (UID: "c2bdec16-ac5d-4456-9862-220cd1ee1d40") : secret "frr-k8s-webhook-server-cert" not found
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.471513 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/841759dc-7794-4849-8d7a-a16b1674d011-metrics\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.471650 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/841759dc-7794-4849-8d7a-a16b1674d011-frr-sockets\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: E0930 17:30:00.471660 4778 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found
Sep 30 17:30:00 crc kubenswrapper[4778]: E0930 17:30:00.471741 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/841759dc-7794-4849-8d7a-a16b1674d011-metrics-certs podName:841759dc-7794-4849-8d7a-a16b1674d011 nodeName:}" failed. No retries permitted until 2025-09-30 17:30:00.971718359 +0000 UTC m=+739.961616152 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/841759dc-7794-4849-8d7a-a16b1674d011-metrics-certs") pod "frr-k8s-26pmr" (UID: "841759dc-7794-4849-8d7a-a16b1674d011") : secret "frr-k8s-certs-secret" not found
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.471851 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/841759dc-7794-4849-8d7a-a16b1674d011-frr-conf\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.472074 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/841759dc-7794-4849-8d7a-a16b1674d011-reloader\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.472299 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/841759dc-7794-4849-8d7a-a16b1674d011-frr-startup\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.483914 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.487596 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgxts\" (UniqueName: \"kubernetes.io/projected/841759dc-7794-4849-8d7a-a16b1674d011-kube-api-access-bgxts\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.498376 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96xrr\" (UniqueName: \"kubernetes.io/projected/c2bdec16-ac5d-4456-9862-220cd1ee1d40-kube-api-access-96xrr\") pod \"frr-k8s-webhook-server-5478bdb765-p5b64\" (UID: \"c2bdec16-ac5d-4456-9862-220cd1ee1d40\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p5b64"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.571865 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ac78c57-42f5-4d55-98a0-74b1c3b4beb3-metrics-certs\") pod \"controller-5d688f5ffc-k845h\" (UID: \"6ac78c57-42f5-4d55-98a0-74b1c3b4beb3\") " pod="metallb-system/controller-5d688f5ffc-k845h"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.571921 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ac78c57-42f5-4d55-98a0-74b1c3b4beb3-cert\") pod \"controller-5d688f5ffc-k845h\" (UID: \"6ac78c57-42f5-4d55-98a0-74b1c3b4beb3\") " pod="metallb-system/controller-5d688f5ffc-k845h"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.571968 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c9e84483-49a9-42d3-b721-929a58feda93-memberlist\") pod \"speaker-8h2sm\" (UID: \"c9e84483-49a9-42d3-b721-929a58feda93\") " pod="metallb-system/speaker-8h2sm"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.572025 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lshm\" (UniqueName: \"kubernetes.io/projected/c9e84483-49a9-42d3-b721-929a58feda93-kube-api-access-9lshm\") pod \"speaker-8h2sm\" (UID: \"c9e84483-49a9-42d3-b721-929a58feda93\") " pod="metallb-system/speaker-8h2sm"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.572058 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c9e84483-49a9-42d3-b721-929a58feda93-metallb-excludel2\") pod \"speaker-8h2sm\" (UID: \"c9e84483-49a9-42d3-b721-929a58feda93\") " pod="metallb-system/speaker-8h2sm"
Sep 30 17:30:00 crc kubenswrapper[4778]: E0930 17:30:00.572065 4778 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Sep 30 17:30:00 crc kubenswrapper[4778]: E0930 17:30:00.572134 4778 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found
Sep 30 17:30:00 crc kubenswrapper[4778]: E0930 17:30:00.572151 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ac78c57-42f5-4d55-98a0-74b1c3b4beb3-metrics-certs podName:6ac78c57-42f5-4d55-98a0-74b1c3b4beb3 nodeName:}" failed. No retries permitted until 2025-09-30 17:30:01.072124306 +0000 UTC m=+740.062022109 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ac78c57-42f5-4d55-98a0-74b1c3b4beb3-metrics-certs") pod "controller-5d688f5ffc-k845h" (UID: "6ac78c57-42f5-4d55-98a0-74b1c3b4beb3") : secret "controller-certs-secret" not found
Sep 30 17:30:00 crc kubenswrapper[4778]: E0930 17:30:00.572173 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9e84483-49a9-42d3-b721-929a58feda93-metrics-certs podName:c9e84483-49a9-42d3-b721-929a58feda93 nodeName:}" failed. No retries permitted until 2025-09-30 17:30:01.072162477 +0000 UTC m=+740.062060280 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c9e84483-49a9-42d3-b721-929a58feda93-metrics-certs") pod "speaker-8h2sm" (UID: "c9e84483-49a9-42d3-b721-929a58feda93") : secret "speaker-certs-secret" not found
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.572079 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9e84483-49a9-42d3-b721-929a58feda93-metrics-certs\") pod \"speaker-8h2sm\" (UID: \"c9e84483-49a9-42d3-b721-929a58feda93\") " pod="metallb-system/speaker-8h2sm"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.572203 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpqg2\" (UniqueName: \"kubernetes.io/projected/6ac78c57-42f5-4d55-98a0-74b1c3b4beb3-kube-api-access-kpqg2\") pod \"controller-5d688f5ffc-k845h\" (UID: \"6ac78c57-42f5-4d55-98a0-74b1c3b4beb3\") " pod="metallb-system/controller-5d688f5ffc-k845h"
Sep 30 17:30:00 crc kubenswrapper[4778]: E0930 17:30:00.572253 4778 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Sep 30 17:30:00 crc kubenswrapper[4778]: E0930 17:30:00.572285 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9e84483-49a9-42d3-b721-929a58feda93-memberlist podName:c9e84483-49a9-42d3-b721-929a58feda93 nodeName:}" failed. No retries permitted until 2025-09-30 17:30:01.072275111 +0000 UTC m=+740.062172914 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c9e84483-49a9-42d3-b721-929a58feda93-memberlist") pod "speaker-8h2sm" (UID: "c9e84483-49a9-42d3-b721-929a58feda93") : secret "metallb-memberlist" not found
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.573277 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c9e84483-49a9-42d3-b721-929a58feda93-metallb-excludel2\") pod \"speaker-8h2sm\" (UID: \"c9e84483-49a9-42d3-b721-929a58feda93\") " pod="metallb-system/speaker-8h2sm"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.575794 4778 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.592777 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpqg2\" (UniqueName: \"kubernetes.io/projected/6ac78c57-42f5-4d55-98a0-74b1c3b4beb3-kube-api-access-kpqg2\") pod \"controller-5d688f5ffc-k845h\" (UID: \"6ac78c57-42f5-4d55-98a0-74b1c3b4beb3\") " pod="metallb-system/controller-5d688f5ffc-k845h"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.593597 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6ac78c57-42f5-4d55-98a0-74b1c3b4beb3-cert\") pod \"controller-5d688f5ffc-k845h\" (UID: \"6ac78c57-42f5-4d55-98a0-74b1c3b4beb3\") " pod="metallb-system/controller-5d688f5ffc-k845h"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.600096 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lshm\" (UniqueName: \"kubernetes.io/projected/c9e84483-49a9-42d3-b721-929a58feda93-kube-api-access-9lshm\") pod \"speaker-8h2sm\" (UID: \"c9e84483-49a9-42d3-b721-929a58feda93\") " pod="metallb-system/speaker-8h2sm"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.722792 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj"]
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.773431 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj" event={"ID":"62ee697e-477a-4ac8-bfc8-7cd17364369d","Type":"ContainerStarted","Data":"9f7a3854dfd82f031508b3c81d52821894910b709f80a840244d9c95281ac97e"}
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.977033 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2bdec16-ac5d-4456-9862-220cd1ee1d40-cert\") pod \"frr-k8s-webhook-server-5478bdb765-p5b64\" (UID: \"c2bdec16-ac5d-4456-9862-220cd1ee1d40\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p5b64"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.977584 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/841759dc-7794-4849-8d7a-a16b1674d011-metrics-certs\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.984101 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/841759dc-7794-4849-8d7a-a16b1674d011-metrics-certs\") pod \"frr-k8s-26pmr\" (UID: \"841759dc-7794-4849-8d7a-a16b1674d011\") " pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.987358 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2bdec16-ac5d-4456-9862-220cd1ee1d40-cert\") pod \"frr-k8s-webhook-server-5478bdb765-p5b64\" (UID: \"c2bdec16-ac5d-4456-9862-220cd1ee1d40\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p5b64"
Sep 30 17:30:00 crc kubenswrapper[4778]: I0930 17:30:00.987737 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p5b64"
Sep 30 17:30:01 crc kubenswrapper[4778]: I0930 17:30:01.079711 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ac78c57-42f5-4d55-98a0-74b1c3b4beb3-metrics-certs\") pod \"controller-5d688f5ffc-k845h\" (UID: \"6ac78c57-42f5-4d55-98a0-74b1c3b4beb3\") " pod="metallb-system/controller-5d688f5ffc-k845h"
Sep 30 17:30:01 crc kubenswrapper[4778]: I0930 17:30:01.079781 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c9e84483-49a9-42d3-b721-929a58feda93-memberlist\") pod \"speaker-8h2sm\" (UID: \"c9e84483-49a9-42d3-b721-929a58feda93\") " pod="metallb-system/speaker-8h2sm"
Sep 30 17:30:01 crc kubenswrapper[4778]: I0930 17:30:01.079841 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9e84483-49a9-42d3-b721-929a58feda93-metrics-certs\") pod \"speaker-8h2sm\" (UID: \"c9e84483-49a9-42d3-b721-929a58feda93\") " pod="metallb-system/speaker-8h2sm"
Sep 30 17:30:01 crc kubenswrapper[4778]: E0930 17:30:01.080065 4778 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Sep 30 17:30:01 crc kubenswrapper[4778]: E0930 17:30:01.080193 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c9e84483-49a9-42d3-b721-929a58feda93-memberlist podName:c9e84483-49a9-42d3-b721-929a58feda93 nodeName:}" failed. No retries permitted until 2025-09-30 17:30:02.080164627 +0000 UTC m=+741.070062440 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c9e84483-49a9-42d3-b721-929a58feda93-memberlist") pod "speaker-8h2sm" (UID: "c9e84483-49a9-42d3-b721-929a58feda93") : secret "metallb-memberlist" not found
Sep 30 17:30:01 crc kubenswrapper[4778]: I0930 17:30:01.083177 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c9e84483-49a9-42d3-b721-929a58feda93-metrics-certs\") pod \"speaker-8h2sm\" (UID: \"c9e84483-49a9-42d3-b721-929a58feda93\") " pod="metallb-system/speaker-8h2sm"
Sep 30 17:30:01 crc kubenswrapper[4778]: I0930 17:30:01.087273 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ac78c57-42f5-4d55-98a0-74b1c3b4beb3-metrics-certs\") pod \"controller-5d688f5ffc-k845h\" (UID: \"6ac78c57-42f5-4d55-98a0-74b1c3b4beb3\") " pod="metallb-system/controller-5d688f5ffc-k845h"
Sep 30 17:30:01 crc kubenswrapper[4778]: I0930 17:30:01.257123 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-26pmr"
Sep 30 17:30:01 crc kubenswrapper[4778]: I0930 17:30:01.277942 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-p5b64"]
Sep 30 17:30:01 crc kubenswrapper[4778]: I0930 17:30:01.375703 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-k845h"
Sep 30 17:30:01 crc kubenswrapper[4778]: I0930 17:30:01.781147 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-26pmr" event={"ID":"841759dc-7794-4849-8d7a-a16b1674d011","Type":"ContainerStarted","Data":"24571e0995e6e0758df643d81df466c3426dc12128d1c20601a73b7e60f8cbd5"}
Sep 30 17:30:01 crc kubenswrapper[4778]: I0930 17:30:01.782156 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p5b64" event={"ID":"c2bdec16-ac5d-4456-9862-220cd1ee1d40","Type":"ContainerStarted","Data":"be730adf57a8b3423a3a8e8a0883987b08ac00925b20cfae23495b07a2948ed8"}
Sep 30 17:30:01 crc kubenswrapper[4778]: I0930 17:30:01.783591 4778 generic.go:334] "Generic (PLEG): container finished" podID="62ee697e-477a-4ac8-bfc8-7cd17364369d" containerID="a03952ed85879c9f729ddb8fae1e66e749b43c0e4d8793b875a18a1a7fe12a68" exitCode=0
Sep 30 17:30:01 crc kubenswrapper[4778]: I0930 17:30:01.783641 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj" event={"ID":"62ee697e-477a-4ac8-bfc8-7cd17364369d","Type":"ContainerDied","Data":"a03952ed85879c9f729ddb8fae1e66e749b43c0e4d8793b875a18a1a7fe12a68"}
Sep 30 17:30:01 crc kubenswrapper[4778]: I0930 17:30:01.793882 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-k845h"]
Sep 30 17:30:02 crc kubenswrapper[4778]: I0930 17:30:02.132360 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c9e84483-49a9-42d3-b721-929a58feda93-memberlist\") pod \"speaker-8h2sm\" (UID: \"c9e84483-49a9-42d3-b721-929a58feda93\") " pod="metallb-system/speaker-8h2sm"
Sep 30 17:30:02 crc kubenswrapper[4778]: I0930 17:30:02.139428 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c9e84483-49a9-42d3-b721-929a58feda93-memberlist\") pod \"speaker-8h2sm\" (UID: \"c9e84483-49a9-42d3-b721-929a58feda93\") " pod="metallb-system/speaker-8h2sm"
Sep 30 17:30:02 crc kubenswrapper[4778]: I0930 17:30:02.266241 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-8h2sm"
Sep 30 17:30:02 crc kubenswrapper[4778]: I0930 17:30:02.804527 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8h2sm" event={"ID":"c9e84483-49a9-42d3-b721-929a58feda93","Type":"ContainerStarted","Data":"6539fb05211c9fd25bb53c98d7bcb9290c32cf5a54e091826ab8f067b60cb8d9"}
Sep 30 17:30:02 crc kubenswrapper[4778]: I0930 17:30:02.804955 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8h2sm" event={"ID":"c9e84483-49a9-42d3-b721-929a58feda93","Type":"ContainerStarted","Data":"04e683c0c9878e399680ea7195a00782149271ff7d427780deadbbcb86db8197"}
Sep 30 17:30:02 crc kubenswrapper[4778]: I0930 17:30:02.823404 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-k845h" event={"ID":"6ac78c57-42f5-4d55-98a0-74b1c3b4beb3","Type":"ContainerStarted","Data":"c10b445542fd631d3be053076c567df6ef4f5979cb07fd18ac7d6d3bf0323db6"}
Sep 30 17:30:02 crc kubenswrapper[4778]: I0930 17:30:02.823447 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-k845h" event={"ID":"6ac78c57-42f5-4d55-98a0-74b1c3b4beb3","Type":"ContainerStarted","Data":"248d691fd06a39acf24f1bdadad5d313a0887027f0e0d62990ee002a233765fb"}
Sep 30 17:30:02 crc kubenswrapper[4778]: I0930 17:30:02.823457 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-k845h" event={"ID":"6ac78c57-42f5-4d55-98a0-74b1c3b4beb3","Type":"ContainerStarted","Data":"8efca7eae31bb24acf9f5ad48a3ab39d995f28192f904fc24f7f9977425b5cda"}
Sep 30 17:30:02 crc kubenswrapper[4778]: I0930 17:30:02.823496 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-k845h"
Sep 30 17:30:03 crc kubenswrapper[4778]: I0930 17:30:03.095560 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj"
Sep 30 17:30:03 crc kubenswrapper[4778]: I0930 17:30:03.114315 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-k845h" podStartSLOduration=3.114291733 podStartE2EDuration="3.114291733s" podCreationTimestamp="2025-09-30 17:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:30:02.845660595 +0000 UTC m=+741.835558398" watchObservedRunningTime="2025-09-30 17:30:03.114291733 +0000 UTC m=+742.104189526"
Sep 30 17:30:03 crc kubenswrapper[4778]: I0930 17:30:03.149856 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2psd\" (UniqueName: \"kubernetes.io/projected/62ee697e-477a-4ac8-bfc8-7cd17364369d-kube-api-access-f2psd\") pod \"62ee697e-477a-4ac8-bfc8-7cd17364369d\" (UID: \"62ee697e-477a-4ac8-bfc8-7cd17364369d\") "
Sep 30 17:30:03 crc kubenswrapper[4778]: I0930 17:30:03.149949 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62ee697e-477a-4ac8-bfc8-7cd17364369d-config-volume\") pod \"62ee697e-477a-4ac8-bfc8-7cd17364369d\" (UID: \"62ee697e-477a-4ac8-bfc8-7cd17364369d\") "
Sep 30 17:30:03 crc kubenswrapper[4778]: I0930 17:30:03.150092 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62ee697e-477a-4ac8-bfc8-7cd17364369d-secret-volume\") pod \"62ee697e-477a-4ac8-bfc8-7cd17364369d\" (UID: \"62ee697e-477a-4ac8-bfc8-7cd17364369d\") "
Sep 30 17:30:03 crc kubenswrapper[4778]: I0930 17:30:03.151062 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62ee697e-477a-4ac8-bfc8-7cd17364369d-config-volume" (OuterVolumeSpecName: "config-volume") pod "62ee697e-477a-4ac8-bfc8-7cd17364369d" (UID: "62ee697e-477a-4ac8-bfc8-7cd17364369d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:30:03 crc kubenswrapper[4778]: I0930 17:30:03.156107 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62ee697e-477a-4ac8-bfc8-7cd17364369d-kube-api-access-f2psd" (OuterVolumeSpecName: "kube-api-access-f2psd") pod "62ee697e-477a-4ac8-bfc8-7cd17364369d" (UID: "62ee697e-477a-4ac8-bfc8-7cd17364369d"). InnerVolumeSpecName "kube-api-access-f2psd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:30:03 crc kubenswrapper[4778]: I0930 17:30:03.156721 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62ee697e-477a-4ac8-bfc8-7cd17364369d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "62ee697e-477a-4ac8-bfc8-7cd17364369d" (UID: "62ee697e-477a-4ac8-bfc8-7cd17364369d"). InnerVolumeSpecName "secret-volume".
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:30:03 crc kubenswrapper[4778]: I0930 17:30:03.251959 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2psd\" (UniqueName: \"kubernetes.io/projected/62ee697e-477a-4ac8-bfc8-7cd17364369d-kube-api-access-f2psd\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:03 crc kubenswrapper[4778]: I0930 17:30:03.251999 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62ee697e-477a-4ac8-bfc8-7cd17364369d-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:03 crc kubenswrapper[4778]: I0930 17:30:03.252009 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62ee697e-477a-4ac8-bfc8-7cd17364369d-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:03 crc kubenswrapper[4778]: I0930 17:30:03.829756 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-8h2sm" event={"ID":"c9e84483-49a9-42d3-b721-929a58feda93","Type":"ContainerStarted","Data":"5a685ad6f48c075de72988d187abed59a84d53f96e750b3e10d120a30aaa7791"} Sep 30 17:30:03 crc kubenswrapper[4778]: I0930 17:30:03.830846 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-8h2sm" Sep 30 17:30:03 crc kubenswrapper[4778]: I0930 17:30:03.832930 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj" Sep 30 17:30:03 crc kubenswrapper[4778]: I0930 17:30:03.833368 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320890-fwlcj" event={"ID":"62ee697e-477a-4ac8-bfc8-7cd17364369d","Type":"ContainerDied","Data":"9f7a3854dfd82f031508b3c81d52821894910b709f80a840244d9c95281ac97e"} Sep 30 17:30:03 crc kubenswrapper[4778]: I0930 17:30:03.833393 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f7a3854dfd82f031508b3c81d52821894910b709f80a840244d9c95281ac97e" Sep 30 17:30:03 crc kubenswrapper[4778]: I0930 17:30:03.850419 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-8h2sm" podStartSLOduration=3.850387873 podStartE2EDuration="3.850387873s" podCreationTimestamp="2025-09-30 17:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:30:03.848381659 +0000 UTC m=+742.838279472" watchObservedRunningTime="2025-09-30 17:30:03.850387873 +0000 UTC m=+742.840285666" Sep 30 17:30:05 crc kubenswrapper[4778]: I0930 17:30:05.572928 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wxgcd"] Sep 30 17:30:05 crc kubenswrapper[4778]: I0930 17:30:05.573611 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" podUID="c1da094b-f44d-4d9d-9fe8-3d19ea244d09" containerName="controller-manager" containerID="cri-o://79703371c7ff4ccfeee833878d766f87ca0dc88ff8b21e6e4a1438e5d17d8d62" gracePeriod=30 Sep 30 17:30:05 crc kubenswrapper[4778]: I0930 17:30:05.664850 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv"] Sep 30 17:30:05 crc kubenswrapper[4778]: I0930 17:30:05.665131 4778 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" podUID="637f30f0-8906-4b9b-bfa8-b356ee2f88d9" containerName="route-controller-manager" containerID="cri-o://1e67e542db52deba749e9acbb733ab4e6200ceaa0286da938afeaf51646f196f" gracePeriod=30 Sep 30 17:30:05 crc kubenswrapper[4778]: I0930 17:30:05.870288 4778 generic.go:334] "Generic (PLEG): container finished" podID="637f30f0-8906-4b9b-bfa8-b356ee2f88d9" containerID="1e67e542db52deba749e9acbb733ab4e6200ceaa0286da938afeaf51646f196f" exitCode=0 Sep 30 17:30:05 crc kubenswrapper[4778]: I0930 17:30:05.870684 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" event={"ID":"637f30f0-8906-4b9b-bfa8-b356ee2f88d9","Type":"ContainerDied","Data":"1e67e542db52deba749e9acbb733ab4e6200ceaa0286da938afeaf51646f196f"} Sep 30 17:30:05 crc kubenswrapper[4778]: I0930 17:30:05.878534 4778 generic.go:334] "Generic (PLEG): container finished" podID="c1da094b-f44d-4d9d-9fe8-3d19ea244d09" containerID="79703371c7ff4ccfeee833878d766f87ca0dc88ff8b21e6e4a1438e5d17d8d62" exitCode=0 Sep 30 17:30:05 crc kubenswrapper[4778]: I0930 17:30:05.879504 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" event={"ID":"c1da094b-f44d-4d9d-9fe8-3d19ea244d09","Type":"ContainerDied","Data":"79703371c7ff4ccfeee833878d766f87ca0dc88ff8b21e6e4a1438e5d17d8d62"} Sep 30 17:30:05 crc kubenswrapper[4778]: I0930 17:30:05.979735 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.098000 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-proxy-ca-bundles\") pod \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.098083 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-client-ca\") pod \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.098117 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96v5t\" (UniqueName: \"kubernetes.io/projected/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-kube-api-access-96v5t\") pod \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.098160 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-serving-cert\") pod \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.098235 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-config\") pod \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\" (UID: \"c1da094b-f44d-4d9d-9fe8-3d19ea244d09\") " Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.099233 4778 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-client-ca" (OuterVolumeSpecName: "client-ca") pod "c1da094b-f44d-4d9d-9fe8-3d19ea244d09" (UID: "c1da094b-f44d-4d9d-9fe8-3d19ea244d09"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.099955 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-config" (OuterVolumeSpecName: "config") pod "c1da094b-f44d-4d9d-9fe8-3d19ea244d09" (UID: "c1da094b-f44d-4d9d-9fe8-3d19ea244d09"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.100030 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "c1da094b-f44d-4d9d-9fe8-3d19ea244d09" (UID: "c1da094b-f44d-4d9d-9fe8-3d19ea244d09"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.106420 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-kube-api-access-96v5t" (OuterVolumeSpecName: "kube-api-access-96v5t") pod "c1da094b-f44d-4d9d-9fe8-3d19ea244d09" (UID: "c1da094b-f44d-4d9d-9fe8-3d19ea244d09"). InnerVolumeSpecName "kube-api-access-96v5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.108360 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c1da094b-f44d-4d9d-9fe8-3d19ea244d09" (UID: "c1da094b-f44d-4d9d-9fe8-3d19ea244d09"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.126399 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.199019 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48gx2\" (UniqueName: \"kubernetes.io/projected/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-kube-api-access-48gx2\") pod \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\" (UID: \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\") " Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.199068 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-config\") pod \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\" (UID: \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\") " Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.199136 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-client-ca\") pod \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\" (UID: \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\") " Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.199233 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-serving-cert\") pod \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\" (UID: \"637f30f0-8906-4b9b-bfa8-b356ee2f88d9\") " Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.199468 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.199485 4778 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.199495 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.199504 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96v5t\" (UniqueName: \"kubernetes.io/projected/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-kube-api-access-96v5t\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.199513 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c1da094b-f44d-4d9d-9fe8-3d19ea244d09-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.200215 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-client-ca" (OuterVolumeSpecName: "client-ca") pod "637f30f0-8906-4b9b-bfa8-b356ee2f88d9" (UID: "637f30f0-8906-4b9b-bfa8-b356ee2f88d9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.200718 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-config" (OuterVolumeSpecName: "config") pod "637f30f0-8906-4b9b-bfa8-b356ee2f88d9" (UID: "637f30f0-8906-4b9b-bfa8-b356ee2f88d9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.204031 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "637f30f0-8906-4b9b-bfa8-b356ee2f88d9" (UID: "637f30f0-8906-4b9b-bfa8-b356ee2f88d9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.204277 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-kube-api-access-48gx2" (OuterVolumeSpecName: "kube-api-access-48gx2") pod "637f30f0-8906-4b9b-bfa8-b356ee2f88d9" (UID: "637f30f0-8906-4b9b-bfa8-b356ee2f88d9"). InnerVolumeSpecName "kube-api-access-48gx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.300361 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48gx2\" (UniqueName: \"kubernetes.io/projected/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-kube-api-access-48gx2\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.300414 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.300428 4778 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.300444 4778 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/637f30f0-8906-4b9b-bfa8-b356ee2f88d9-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.900854 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.905318 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv" event={"ID":"637f30f0-8906-4b9b-bfa8-b356ee2f88d9","Type":"ContainerDied","Data":"d8affaeddf0425e1f3a90b0165045686c3cbbfe4e9954a67dbf1b63cb442bf50"} Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.905384 4778 scope.go:117] "RemoveContainer" containerID="1e67e542db52deba749e9acbb733ab4e6200ceaa0286da938afeaf51646f196f" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.911848 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" event={"ID":"c1da094b-f44d-4d9d-9fe8-3d19ea244d09","Type":"ContainerDied","Data":"58df60e693a043f17da768ad0269c81e943882cb2f7f2185469d95aa9b26d3ba"} Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.911944 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-wxgcd" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.930082 4778 scope.go:117] "RemoveContainer" containerID="79703371c7ff4ccfeee833878d766f87ca0dc88ff8b21e6e4a1438e5d17d8d62" Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.944292 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wxgcd"] Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.950567 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-wxgcd"] Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.956691 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv"] Sep 30 17:30:06 crc kubenswrapper[4778]: I0930 17:30:06.967657 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-kcflv"] Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.072239 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74f4b8557d-jnwjq"] Sep 30 17:30:07 crc kubenswrapper[4778]: E0930 17:30:07.072485 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="637f30f0-8906-4b9b-bfa8-b356ee2f88d9" containerName="route-controller-manager" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.072499 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="637f30f0-8906-4b9b-bfa8-b356ee2f88d9" containerName="route-controller-manager" Sep 30 17:30:07 crc kubenswrapper[4778]: E0930 17:30:07.072514 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1da094b-f44d-4d9d-9fe8-3d19ea244d09" containerName="controller-manager" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.072520 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1da094b-f44d-4d9d-9fe8-3d19ea244d09" containerName="controller-manager" Sep 30 17:30:07 crc kubenswrapper[4778]: E0930 17:30:07.072538 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ee697e-477a-4ac8-bfc8-7cd17364369d" containerName="collect-profiles" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.072544 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="62ee697e-477a-4ac8-bfc8-7cd17364369d" containerName="collect-profiles" 
Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.072694 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="62ee697e-477a-4ac8-bfc8-7cd17364369d" containerName="collect-profiles" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.072705 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="637f30f0-8906-4b9b-bfa8-b356ee2f88d9" containerName="route-controller-manager" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.072717 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1da094b-f44d-4d9d-9fe8-3d19ea244d09" containerName="controller-manager" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.073115 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.076994 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.077075 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.077827 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.077855 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.077911 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.078121 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.084874 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.088918 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74f4b8557d-jnwjq"] Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.093801 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2"] Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.094630 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.097507 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.098071 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.098262 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.098808 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.099186 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.099678 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.109753 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmmdd\" (UniqueName: \"kubernetes.io/projected/efbe1634-fdde-4bc1-9525-008440649196-kube-api-access-pmmdd\") pod \"controller-manager-74f4b8557d-jnwjq\" (UID: \"efbe1634-fdde-4bc1-9525-008440649196\") " pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.110013 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efbe1634-fdde-4bc1-9525-008440649196-config\") pod \"controller-manager-74f4b8557d-jnwjq\" (UID: \"efbe1634-fdde-4bc1-9525-008440649196\") " pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.110153 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efbe1634-fdde-4bc1-9525-008440649196-client-ca\") pod \"controller-manager-74f4b8557d-jnwjq\" (UID: \"efbe1634-fdde-4bc1-9525-008440649196\") " pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.110292 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/efbe1634-fdde-4bc1-9525-008440649196-proxy-ca-bundles\") pod \"controller-manager-74f4b8557d-jnwjq\" (UID: \"efbe1634-fdde-4bc1-9525-008440649196\") " pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.110404 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efbe1634-fdde-4bc1-9525-008440649196-serving-cert\") pod \"controller-manager-74f4b8557d-jnwjq\" (UID: \"efbe1634-fdde-4bc1-9525-008440649196\") " pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.110810 4778 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2"] Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.212076 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmmdd\" (UniqueName: \"kubernetes.io/projected/efbe1634-fdde-4bc1-9525-008440649196-kube-api-access-pmmdd\") pod \"controller-manager-74f4b8557d-jnwjq\" (UID: \"efbe1634-fdde-4bc1-9525-008440649196\") " pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.212530 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba87158f-9331-474b-9e0e-d70ad8a12528-client-ca\") pod \"route-controller-manager-6545d86f68-5vld2\" (UID: \"ba87158f-9331-474b-9e0e-d70ad8a12528\") " pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.212672 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba87158f-9331-474b-9e0e-d70ad8a12528-serving-cert\") pod \"route-controller-manager-6545d86f68-5vld2\" (UID: \"ba87158f-9331-474b-9e0e-d70ad8a12528\") " pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.212753 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba87158f-9331-474b-9e0e-d70ad8a12528-config\") pod \"route-controller-manager-6545d86f68-5vld2\" (UID: \"ba87158f-9331-474b-9e0e-d70ad8a12528\") " pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.212835 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efbe1634-fdde-4bc1-9525-008440649196-config\") pod \"controller-manager-74f4b8557d-jnwjq\" (UID: \"efbe1634-fdde-4bc1-9525-008440649196\") " pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.212930 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqkns\" (UniqueName: \"kubernetes.io/projected/ba87158f-9331-474b-9e0e-d70ad8a12528-kube-api-access-rqkns\") pod \"route-controller-manager-6545d86f68-5vld2\" (UID: \"ba87158f-9331-474b-9e0e-d70ad8a12528\") " pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.213028 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efbe1634-fdde-4bc1-9525-008440649196-client-ca\") pod \"controller-manager-74f4b8557d-jnwjq\" (UID: \"efbe1634-fdde-4bc1-9525-008440649196\") " pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.213103 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/efbe1634-fdde-4bc1-9525-008440649196-proxy-ca-bundles\") pod \"controller-manager-74f4b8557d-jnwjq\" (UID: \"efbe1634-fdde-4bc1-9525-008440649196\") " 
pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.213186 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efbe1634-fdde-4bc1-9525-008440649196-serving-cert\") pod \"controller-manager-74f4b8557d-jnwjq\" (UID: \"efbe1634-fdde-4bc1-9525-008440649196\") " pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.216806 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/efbe1634-fdde-4bc1-9525-008440649196-config\") pod \"controller-manager-74f4b8557d-jnwjq\" (UID: \"efbe1634-fdde-4bc1-9525-008440649196\") " pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.217170 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/efbe1634-fdde-4bc1-9525-008440649196-proxy-ca-bundles\") pod \"controller-manager-74f4b8557d-jnwjq\" (UID: \"efbe1634-fdde-4bc1-9525-008440649196\") " pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.217738 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/efbe1634-fdde-4bc1-9525-008440649196-client-ca\") pod \"controller-manager-74f4b8557d-jnwjq\" (UID: \"efbe1634-fdde-4bc1-9525-008440649196\") " pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.218020 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efbe1634-fdde-4bc1-9525-008440649196-serving-cert\") pod \"controller-manager-74f4b8557d-jnwjq\" (UID: \"efbe1634-fdde-4bc1-9525-008440649196\") " pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.229938 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmmdd\" (UniqueName: \"kubernetes.io/projected/efbe1634-fdde-4bc1-9525-008440649196-kube-api-access-pmmdd\") pod \"controller-manager-74f4b8557d-jnwjq\" (UID: \"efbe1634-fdde-4bc1-9525-008440649196\") " pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.314132 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba87158f-9331-474b-9e0e-d70ad8a12528-client-ca\") pod \"route-controller-manager-6545d86f68-5vld2\" (UID: \"ba87158f-9331-474b-9e0e-d70ad8a12528\") " pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.314193 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba87158f-9331-474b-9e0e-d70ad8a12528-serving-cert\") pod \"route-controller-manager-6545d86f68-5vld2\" (UID: \"ba87158f-9331-474b-9e0e-d70ad8a12528\") " pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.314219 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/ba87158f-9331-474b-9e0e-d70ad8a12528-config\") pod \"route-controller-manager-6545d86f68-5vld2\" (UID: \"ba87158f-9331-474b-9e0e-d70ad8a12528\") " pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.314271 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqkns\" (UniqueName: \"kubernetes.io/projected/ba87158f-9331-474b-9e0e-d70ad8a12528-kube-api-access-rqkns\") pod \"route-controller-manager-6545d86f68-5vld2\" (UID: \"ba87158f-9331-474b-9e0e-d70ad8a12528\") " pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.315297 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ba87158f-9331-474b-9e0e-d70ad8a12528-client-ca\") pod \"route-controller-manager-6545d86f68-5vld2\" (UID: \"ba87158f-9331-474b-9e0e-d70ad8a12528\") " pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.316374 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba87158f-9331-474b-9e0e-d70ad8a12528-config\") pod \"route-controller-manager-6545d86f68-5vld2\" (UID: \"ba87158f-9331-474b-9e0e-d70ad8a12528\") " pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.318337 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba87158f-9331-474b-9e0e-d70ad8a12528-serving-cert\") pod \"route-controller-manager-6545d86f68-5vld2\" (UID: \"ba87158f-9331-474b-9e0e-d70ad8a12528\") " pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.330883 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqkns\" (UniqueName: \"kubernetes.io/projected/ba87158f-9331-474b-9e0e-d70ad8a12528-kube-api-access-rqkns\") pod \"route-controller-manager-6545d86f68-5vld2\" (UID: \"ba87158f-9331-474b-9e0e-d70ad8a12528\") " pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.395499 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.412447 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.722908 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="637f30f0-8906-4b9b-bfa8-b356ee2f88d9" path="/var/lib/kubelet/pods/637f30f0-8906-4b9b-bfa8-b356ee2f88d9/volumes" Sep 30 17:30:07 crc kubenswrapper[4778]: I0930 17:30:07.723455 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1da094b-f44d-4d9d-9fe8-3d19ea244d09" path="/var/lib/kubelet/pods/c1da094b-f44d-4d9d-9fe8-3d19ea244d09/volumes" Sep 30 17:30:10 crc kubenswrapper[4778]: I0930 17:30:10.588869 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2"] Sep 30 17:30:10 crc kubenswrapper[4778]: W0930 17:30:10.599717 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba87158f_9331_474b_9e0e_d70ad8a12528.slice/crio-dd476b105257f8dc79c16b9acefe15fa219dfe7bed9676ba44a07535b3983bec WatchSource:0}: Error finding container dd476b105257f8dc79c16b9acefe15fa219dfe7bed9676ba44a07535b3983bec: Status 404 returned error can't find the container with id dd476b105257f8dc79c16b9acefe15fa219dfe7bed9676ba44a07535b3983bec Sep 30 17:30:10 crc kubenswrapper[4778]: I0930 17:30:10.843803 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74f4b8557d-jnwjq"] Sep 30 17:30:10 crc kubenswrapper[4778]: W0930 17:30:10.852682 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefbe1634_fdde_4bc1_9525_008440649196.slice/crio-b9184215ba7065fb525295a1d1f36a6e71ebacdcd9449fbec557dba63e55de85 WatchSource:0}: Error finding container b9184215ba7065fb525295a1d1f36a6e71ebacdcd9449fbec557dba63e55de85: Status 404 returned error can't find the container with id b9184215ba7065fb525295a1d1f36a6e71ebacdcd9449fbec557dba63e55de85 Sep 30 17:30:10 crc kubenswrapper[4778]: I0930 17:30:10.958282 4778 generic.go:334] "Generic (PLEG): container finished" podID="841759dc-7794-4849-8d7a-a16b1674d011" containerID="edd04a1af827f3e07ae2f7267008e5c6e51d934406f56cb133f772a99f8641dd" exitCode=0 Sep 30 17:30:10 crc kubenswrapper[4778]: I0930 17:30:10.958818 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-26pmr" event={"ID":"841759dc-7794-4849-8d7a-a16b1674d011","Type":"ContainerDied","Data":"edd04a1af827f3e07ae2f7267008e5c6e51d934406f56cb133f772a99f8641dd"} Sep 30 17:30:10 crc kubenswrapper[4778]: I0930 17:30:10.960798 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" event={"ID":"efbe1634-fdde-4bc1-9525-008440649196","Type":"ContainerStarted","Data":"b9184215ba7065fb525295a1d1f36a6e71ebacdcd9449fbec557dba63e55de85"} Sep 30 17:30:10 crc kubenswrapper[4778]: I0930 17:30:10.970365 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" event={"ID":"ba87158f-9331-474b-9e0e-d70ad8a12528","Type":"ContainerStarted","Data":"7675c7322001802fb4ef6ef338e17e6d66efffb0ccb10c27cf3d665a9461bf9f"} Sep 30 17:30:10 crc kubenswrapper[4778]: I0930 17:30:10.970432 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" 
event={"ID":"ba87158f-9331-474b-9e0e-d70ad8a12528","Type":"ContainerStarted","Data":"dd476b105257f8dc79c16b9acefe15fa219dfe7bed9676ba44a07535b3983bec"} Sep 30 17:30:10 crc kubenswrapper[4778]: I0930 17:30:10.971034 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" Sep 30 17:30:10 crc kubenswrapper[4778]: I0930 17:30:10.973661 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p5b64" event={"ID":"c2bdec16-ac5d-4456-9862-220cd1ee1d40","Type":"ContainerStarted","Data":"17ba49939529f6a3bd720ec51fccfae50ac3664019bf7a70a5a46e3d5ac31bbb"} Sep 30 17:30:10 crc kubenswrapper[4778]: I0930 17:30:10.974929 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p5b64" Sep 30 17:30:10 crc kubenswrapper[4778]: I0930 17:30:10.991195 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" podStartSLOduration=3.991176819 podStartE2EDuration="3.991176819s" podCreationTimestamp="2025-09-30 17:30:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:30:10.987014336 +0000 UTC m=+749.976912139" watchObservedRunningTime="2025-09-30 17:30:10.991176819 +0000 UTC m=+749.981074622" Sep 30 17:30:11 crc kubenswrapper[4778]: I0930 17:30:11.008977 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p5b64" podStartSLOduration=1.875290517 podStartE2EDuration="11.008932998s" podCreationTimestamp="2025-09-30 17:30:00 +0000 UTC" firstStartedPulling="2025-09-30 17:30:01.285841448 +0000 UTC m=+740.275739251" lastFinishedPulling="2025-09-30 17:30:10.419483929 +0000 UTC m=+749.409381732" observedRunningTime="2025-09-30 17:30:11.008141503 +0000 UTC m=+749.998039306" watchObservedRunningTime="2025-09-30 17:30:11.008932998 +0000 UTC m=+749.998830791" Sep 30 17:30:11 crc kubenswrapper[4778]: I0930 17:30:11.243584 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6545d86f68-5vld2" Sep 30 17:30:11 crc kubenswrapper[4778]: I0930 17:30:11.382480 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-k845h" Sep 30 17:30:11 crc kubenswrapper[4778]: I0930 17:30:11.983784 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" event={"ID":"efbe1634-fdde-4bc1-9525-008440649196","Type":"ContainerStarted","Data":"fddfc44aef8dcf59256384cb4eb18bfb769e371d8d088cb57e419bb1a1fad22b"} Sep 30 17:30:12 crc kubenswrapper[4778]: I0930 17:30:12.064215 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" podStartSLOduration=5.062606135 podStartE2EDuration="5.062606135s" podCreationTimestamp="2025-09-30 17:30:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:30:12.060375484 +0000 UTC m=+751.050273307" watchObservedRunningTime="2025-09-30 17:30:12.062606135 +0000 UTC m=+751.052503948" Sep 30 17:30:12 crc kubenswrapper[4778]: I0930 17:30:12.272755 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-8h2sm" Sep 30 17:30:12 crc kubenswrapper[4778]: I0930 17:30:12.993357 4778 generic.go:334] "Generic (PLEG): container finished" podID="841759dc-7794-4849-8d7a-a16b1674d011" containerID="18a0646dcfd286408c0f1799362dfad8c70257b4e53eccfacc91c46a58feb7d7" exitCode=0 Sep 30 17:30:12 crc kubenswrapper[4778]: I0930 17:30:12.993575 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-26pmr" event={"ID":"841759dc-7794-4849-8d7a-a16b1674d011","Type":"ContainerDied","Data":"18a0646dcfd286408c0f1799362dfad8c70257b4e53eccfacc91c46a58feb7d7"} Sep 30 17:30:12 crc kubenswrapper[4778]: I0930 17:30:12.994275 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:12 crc kubenswrapper[4778]: I0930 17:30:12.999315 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74f4b8557d-jnwjq" Sep 30 17:30:14 crc kubenswrapper[4778]: I0930 17:30:14.004721 4778 generic.go:334] "Generic (PLEG): container finished" podID="841759dc-7794-4849-8d7a-a16b1674d011" containerID="3c263e1a23f75fbbedfd4120132e7eba6c150f33dbdc41d8b2ead395d15da83c" exitCode=0 Sep 30 17:30:14 crc kubenswrapper[4778]: I0930 17:30:14.004832 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-26pmr" event={"ID":"841759dc-7794-4849-8d7a-a16b1674d011","Type":"ContainerDied","Data":"3c263e1a23f75fbbedfd4120132e7eba6c150f33dbdc41d8b2ead395d15da83c"} Sep 30 17:30:15 crc kubenswrapper[4778]: I0930 17:30:15.017447 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-26pmr" event={"ID":"841759dc-7794-4849-8d7a-a16b1674d011","Type":"ContainerStarted","Data":"cd155ee4f2b8e6cc14ddab7001623b00b5ea36dcbdf8696a0b798dce2b53dfaf"} Sep 30 17:30:15 crc kubenswrapper[4778]: I0930 17:30:15.018153 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-26pmr" event={"ID":"841759dc-7794-4849-8d7a-a16b1674d011","Type":"ContainerStarted","Data":"92bb7e89109f6f061f4373e1c1024884e74f48cc134701120178b213ef6767cd"} Sep 30 17:30:15 crc kubenswrapper[4778]: I0930 17:30:15.018173 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-26pmr" event={"ID":"841759dc-7794-4849-8d7a-a16b1674d011","Type":"ContainerStarted","Data":"5a3d98298f661d3638694ce58e495f0aecb1cf749bb25694d346606d5967eaa1"} Sep 30 17:30:15 crc kubenswrapper[4778]: I0930 17:30:15.018184 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-26pmr" event={"ID":"841759dc-7794-4849-8d7a-a16b1674d011","Type":"ContainerStarted","Data":"1490a4957c8536a1b853306dfa643137c0984ac743dd47eb005437de44e1d548"} Sep 30 17:30:15 crc kubenswrapper[4778]: I0930 17:30:15.018195 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-26pmr" event={"ID":"841759dc-7794-4849-8d7a-a16b1674d011","Type":"ContainerStarted","Data":"db7a8e635543da07b161b4859e26eb9fdcb7124568c16f2ddf91a220267d3da9"} Sep 30 17:30:15 crc kubenswrapper[4778]: I0930 17:30:15.483247 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-jw4sf"] Sep 30 17:30:15 crc kubenswrapper[4778]: I0930 17:30:15.484092 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jw4sf" Sep 30 17:30:15 crc kubenswrapper[4778]: I0930 17:30:15.486455 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Sep 30 17:30:15 crc kubenswrapper[4778]: I0930 17:30:15.486542 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Sep 30 17:30:15 crc kubenswrapper[4778]: I0930 17:30:15.500888 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jw4sf"] Sep 30 17:30:15 crc kubenswrapper[4778]: I0930 17:30:15.541714 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhw8c\" (UniqueName: \"kubernetes.io/projected/01b868c5-bdf7-4286-acf8-3243b3f1f1b2-kube-api-access-nhw8c\") pod \"openstack-operator-index-jw4sf\" (UID: \"01b868c5-bdf7-4286-acf8-3243b3f1f1b2\") " pod="openstack-operators/openstack-operator-index-jw4sf" Sep 30 17:30:15 crc kubenswrapper[4778]: I0930 17:30:15.643361 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhw8c\" (UniqueName: \"kubernetes.io/projected/01b868c5-bdf7-4286-acf8-3243b3f1f1b2-kube-api-access-nhw8c\") pod \"openstack-operator-index-jw4sf\" (UID: \"01b868c5-bdf7-4286-acf8-3243b3f1f1b2\") " pod="openstack-operators/openstack-operator-index-jw4sf" Sep 30 17:30:15 crc kubenswrapper[4778]: I0930 17:30:15.663784 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhw8c\" (UniqueName: \"kubernetes.io/projected/01b868c5-bdf7-4286-acf8-3243b3f1f1b2-kube-api-access-nhw8c\") pod \"openstack-operator-index-jw4sf\" (UID: \"01b868c5-bdf7-4286-acf8-3243b3f1f1b2\") " pod="openstack-operators/openstack-operator-index-jw4sf" Sep 30 17:30:15 crc kubenswrapper[4778]: I0930 17:30:15.806659 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jw4sf" Sep 30 17:30:16 crc kubenswrapper[4778]: I0930 17:30:16.031510 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-26pmr" event={"ID":"841759dc-7794-4849-8d7a-a16b1674d011","Type":"ContainerStarted","Data":"597e9e1368694880098b9a83f3ddb3df96405ef1bc48e8aa517e85af52603bd4"} Sep 30 17:30:16 crc kubenswrapper[4778]: I0930 17:30:16.032102 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-26pmr" Sep 30 17:30:16 crc kubenswrapper[4778]: I0930 17:30:16.064812 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-26pmr" podStartSLOduration=7.030716971 podStartE2EDuration="16.064776211s" podCreationTimestamp="2025-09-30 17:30:00 +0000 UTC" firstStartedPulling="2025-09-30 17:30:01.369024974 +0000 UTC m=+740.358922787" lastFinishedPulling="2025-09-30 17:30:10.403084234 +0000 UTC m=+749.392982027" observedRunningTime="2025-09-30 17:30:16.056042051 +0000 UTC m=+755.045939884" watchObservedRunningTime="2025-09-30 17:30:16.064776211 +0000 UTC m=+755.054674024" Sep 30 17:30:16 crc kubenswrapper[4778]: I0930 17:30:16.258575 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-jw4sf"] Sep 30 17:30:16 crc kubenswrapper[4778]: I0930 17:30:16.258875 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-26pmr" Sep 30 17:30:16 crc kubenswrapper[4778]: I0930 17:30:16.323416 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-26pmr" Sep 30 17:30:17 crc kubenswrapper[4778]: I0930 17:30:17.039477 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jw4sf" event={"ID":"01b868c5-bdf7-4286-acf8-3243b3f1f1b2","Type":"ContainerStarted","Data":"6024e5eb9861d3e62a62b9ca207ff343bfad20138dac32e042fd3f4003b448c1"} Sep 30 17:30:18 crc kubenswrapper[4778]: I0930 17:30:18.391137 4778 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 17:30:18 crc kubenswrapper[4778]: I0930 17:30:18.862926 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jw4sf"] Sep 30 17:30:19 crc kubenswrapper[4778]: I0930 17:30:19.472670 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-d4xhf"] Sep 30 17:30:19 crc kubenswrapper[4778]: I0930 17:30:19.474104 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-d4xhf" Sep 30 17:30:19 crc kubenswrapper[4778]: I0930 17:30:19.477601 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-8zwwh" Sep 30 17:30:19 crc kubenswrapper[4778]: I0930 17:30:19.484873 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-d4xhf"] Sep 30 17:30:19 crc kubenswrapper[4778]: I0930 17:30:19.522322 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqhn6\" (UniqueName: \"kubernetes.io/projected/bcfc85f5-7dbe-4093-8177-a1413ebfbaca-kube-api-access-pqhn6\") pod \"openstack-operator-index-d4xhf\" (UID: \"bcfc85f5-7dbe-4093-8177-a1413ebfbaca\") " pod="openstack-operators/openstack-operator-index-d4xhf" Sep 30 17:30:19 crc kubenswrapper[4778]: I0930 17:30:19.624284 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqhn6\" (UniqueName: \"kubernetes.io/projected/bcfc85f5-7dbe-4093-8177-a1413ebfbaca-kube-api-access-pqhn6\") pod \"openstack-operator-index-d4xhf\" (UID: \"bcfc85f5-7dbe-4093-8177-a1413ebfbaca\") " pod="openstack-operators/openstack-operator-index-d4xhf" Sep 30 17:30:19 crc kubenswrapper[4778]: I0930 17:30:19.657836 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqhn6\" (UniqueName: \"kubernetes.io/projected/bcfc85f5-7dbe-4093-8177-a1413ebfbaca-kube-api-access-pqhn6\") pod \"openstack-operator-index-d4xhf\" (UID: \"bcfc85f5-7dbe-4093-8177-a1413ebfbaca\") " pod="openstack-operators/openstack-operator-index-d4xhf" Sep 30 17:30:19 crc kubenswrapper[4778]: I0930 17:30:19.797873 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-d4xhf" Sep 30 17:30:20 crc kubenswrapper[4778]: I0930 17:30:20.069020 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jw4sf" event={"ID":"01b868c5-bdf7-4286-acf8-3243b3f1f1b2","Type":"ContainerStarted","Data":"3fc50de05ca8a4549cabe44f58ce72fec9698284da1b228cb8ebb220e4e7793e"} Sep 30 17:30:20 crc kubenswrapper[4778]: I0930 17:30:20.069181 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-jw4sf" podUID="01b868c5-bdf7-4286-acf8-3243b3f1f1b2" containerName="registry-server" containerID="cri-o://3fc50de05ca8a4549cabe44f58ce72fec9698284da1b228cb8ebb220e4e7793e" gracePeriod=2 Sep 30 17:30:20 crc kubenswrapper[4778]: I0930 17:30:20.094004 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-jw4sf" podStartSLOduration=2.227073138 podStartE2EDuration="5.093978032s" podCreationTimestamp="2025-09-30 17:30:15 +0000 UTC" firstStartedPulling="2025-09-30 17:30:16.270364769 +0000 UTC m=+755.260262612" lastFinishedPulling="2025-09-30 17:30:19.137269693 +0000 UTC m=+758.127167506" observedRunningTime="2025-09-30 17:30:20.092710321 +0000 UTC m=+759.082608124" watchObservedRunningTime="2025-09-30 17:30:20.093978032 +0000 UTC m=+759.083875845" Sep 30 17:30:20 crc kubenswrapper[4778]: I0930 17:30:20.281510 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-d4xhf"] Sep 30 17:30:20 crc kubenswrapper[4778]: I0930 17:30:20.606506 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jw4sf" Sep 30 17:30:20 crc kubenswrapper[4778]: I0930 17:30:20.638386 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhw8c\" (UniqueName: \"kubernetes.io/projected/01b868c5-bdf7-4286-acf8-3243b3f1f1b2-kube-api-access-nhw8c\") pod \"01b868c5-bdf7-4286-acf8-3243b3f1f1b2\" (UID: \"01b868c5-bdf7-4286-acf8-3243b3f1f1b2\") " Sep 30 17:30:20 crc kubenswrapper[4778]: I0930 17:30:20.647608 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01b868c5-bdf7-4286-acf8-3243b3f1f1b2-kube-api-access-nhw8c" (OuterVolumeSpecName: "kube-api-access-nhw8c") pod "01b868c5-bdf7-4286-acf8-3243b3f1f1b2" (UID: "01b868c5-bdf7-4286-acf8-3243b3f1f1b2"). InnerVolumeSpecName "kube-api-access-nhw8c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:30:20 crc kubenswrapper[4778]: I0930 17:30:20.740768 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhw8c\" (UniqueName: \"kubernetes.io/projected/01b868c5-bdf7-4286-acf8-3243b3f1f1b2-kube-api-access-nhw8c\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:20 crc kubenswrapper[4778]: I0930 17:30:20.994418 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-p5b64" Sep 30 17:30:21 crc kubenswrapper[4778]: I0930 17:30:21.078670 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jw4sf" event={"ID":"01b868c5-bdf7-4286-acf8-3243b3f1f1b2","Type":"ContainerDied","Data":"3fc50de05ca8a4549cabe44f58ce72fec9698284da1b228cb8ebb220e4e7793e"} Sep 30 17:30:21 crc kubenswrapper[4778]: I0930 17:30:21.078790 4778 scope.go:117] "RemoveContainer" containerID="3fc50de05ca8a4549cabe44f58ce72fec9698284da1b228cb8ebb220e4e7793e" Sep 30 17:30:21 crc kubenswrapper[4778]: I0930 17:30:21.078912 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-jw4sf" Sep 30 17:30:21 crc kubenswrapper[4778]: I0930 17:30:21.078761 4778 generic.go:334] "Generic (PLEG): container finished" podID="01b868c5-bdf7-4286-acf8-3243b3f1f1b2" containerID="3fc50de05ca8a4549cabe44f58ce72fec9698284da1b228cb8ebb220e4e7793e" exitCode=0 Sep 30 17:30:21 crc kubenswrapper[4778]: I0930 17:30:21.079217 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-jw4sf" event={"ID":"01b868c5-bdf7-4286-acf8-3243b3f1f1b2","Type":"ContainerDied","Data":"6024e5eb9861d3e62a62b9ca207ff343bfad20138dac32e042fd3f4003b448c1"} Sep 30 17:30:21 crc kubenswrapper[4778]: I0930 17:30:21.081418 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-d4xhf" event={"ID":"bcfc85f5-7dbe-4093-8177-a1413ebfbaca","Type":"ContainerStarted","Data":"58342ea49362582420719506b51056d1c532474c20f54f6c8426abff8c1abaf0"} Sep 30 17:30:21 crc kubenswrapper[4778]: I0930 17:30:21.081479 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-d4xhf" event={"ID":"bcfc85f5-7dbe-4093-8177-a1413ebfbaca","Type":"ContainerStarted","Data":"cf89da1883e66d3fbe9847735bc2779010cd0702a9424e2726e55cc282471830"} Sep 30 17:30:21 crc kubenswrapper[4778]: I0930 17:30:21.109610 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-d4xhf" podStartSLOduration=2.050586557 podStartE2EDuration="2.109573038s" podCreationTimestamp="2025-09-30 17:30:19 +0000 UTC" firstStartedPulling="2025-09-30 17:30:20.304204499 +0000 UTC m=+759.294102322" lastFinishedPulling="2025-09-30 17:30:20.36319101 +0000 UTC m=+759.353088803" observedRunningTime="2025-09-30 17:30:21.103850115 +0000 UTC m=+760.093747948" watchObservedRunningTime="2025-09-30 17:30:21.109573038 +0000 UTC m=+760.099470871" Sep 30 17:30:21 crc kubenswrapper[4778]: I0930 17:30:21.109801 4778 scope.go:117] "RemoveContainer" containerID="3fc50de05ca8a4549cabe44f58ce72fec9698284da1b228cb8ebb220e4e7793e" Sep 30 17:30:21 crc kubenswrapper[4778]: E0930 17:30:21.112692 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fc50de05ca8a4549cabe44f58ce72fec9698284da1b228cb8ebb220e4e7793e\": container with ID starting with 3fc50de05ca8a4549cabe44f58ce72fec9698284da1b228cb8ebb220e4e7793e not found: ID does not exist" containerID="3fc50de05ca8a4549cabe44f58ce72fec9698284da1b228cb8ebb220e4e7793e" Sep 30 17:30:21 crc kubenswrapper[4778]: I0930 17:30:21.112787 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fc50de05ca8a4549cabe44f58ce72fec9698284da1b228cb8ebb220e4e7793e"} err="failed to get container status \"3fc50de05ca8a4549cabe44f58ce72fec9698284da1b228cb8ebb220e4e7793e\": rpc error: code = NotFound desc = could not find container \"3fc50de05ca8a4549cabe44f58ce72fec9698284da1b228cb8ebb220e4e7793e\": container with ID starting with 3fc50de05ca8a4549cabe44f58ce72fec9698284da1b228cb8ebb220e4e7793e not found: ID does not exist" Sep 30 17:30:21 crc kubenswrapper[4778]: I0930 17:30:21.130353 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-jw4sf"] Sep 30 17:30:21 crc kubenswrapper[4778]: I0930 17:30:21.135113 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-jw4sf"] Sep 30 17:30:21 
crc kubenswrapper[4778]: I0930 17:30:21.729587 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01b868c5-bdf7-4286-acf8-3243b3f1f1b2" path="/var/lib/kubelet/pods/01b868c5-bdf7-4286-acf8-3243b3f1f1b2/volumes" Sep 30 17:30:29 crc kubenswrapper[4778]: I0930 17:30:29.798050 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-d4xhf" Sep 30 17:30:29 crc kubenswrapper[4778]: I0930 17:30:29.798987 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-d4xhf" Sep 30 17:30:29 crc kubenswrapper[4778]: I0930 17:30:29.838561 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-d4xhf" Sep 30 17:30:30 crc kubenswrapper[4778]: I0930 17:30:30.199842 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-d4xhf" Sep 30 17:30:31 crc kubenswrapper[4778]: I0930 17:30:31.261848 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-26pmr" Sep 30 17:30:35 crc kubenswrapper[4778]: I0930 17:30:35.363715 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb"] Sep 30 17:30:35 crc kubenswrapper[4778]: E0930 17:30:35.364246 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b868c5-bdf7-4286-acf8-3243b3f1f1b2" containerName="registry-server" Sep 30 17:30:35 crc kubenswrapper[4778]: I0930 17:30:35.364259 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b868c5-bdf7-4286-acf8-3243b3f1f1b2" containerName="registry-server" Sep 30 17:30:35 crc kubenswrapper[4778]: I0930 17:30:35.364363 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b868c5-bdf7-4286-acf8-3243b3f1f1b2" containerName="registry-server" Sep 30 17:30:35 crc kubenswrapper[4778]: I0930 17:30:35.365277 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" Sep 30 17:30:35 crc kubenswrapper[4778]: I0930 17:30:35.377723 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb"] Sep 30 17:30:35 crc kubenswrapper[4778]: I0930 17:30:35.378439 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-w6tgt" Sep 30 17:30:35 crc kubenswrapper[4778]: I0930 17:30:35.484413 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87932d5f-678b-4115-8fa4-37a1fa008062-bundle\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb\" (UID: \"87932d5f-678b-4115-8fa4-37a1fa008062\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" Sep 30 17:30:35 crc kubenswrapper[4778]: I0930 17:30:35.484470 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65b5s\" (UniqueName: \"kubernetes.io/projected/87932d5f-678b-4115-8fa4-37a1fa008062-kube-api-access-65b5s\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb\" (UID: \"87932d5f-678b-4115-8fa4-37a1fa008062\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" Sep 30 17:30:35 crc kubenswrapper[4778]: I0930 17:30:35.484507 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87932d5f-678b-4115-8fa4-37a1fa008062-util\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb\" (UID: \"87932d5f-678b-4115-8fa4-37a1fa008062\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" Sep 30 17:30:35 crc kubenswrapper[4778]: I0930 17:30:35.586000 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87932d5f-678b-4115-8fa4-37a1fa008062-bundle\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb\" (UID: \"87932d5f-678b-4115-8fa4-37a1fa008062\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" Sep 30 17:30:35 crc kubenswrapper[4778]: I0930 17:30:35.586097 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65b5s\" (UniqueName: \"kubernetes.io/projected/87932d5f-678b-4115-8fa4-37a1fa008062-kube-api-access-65b5s\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb\" (UID: \"87932d5f-678b-4115-8fa4-37a1fa008062\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" Sep 30 17:30:35 crc kubenswrapper[4778]: I0930 17:30:35.586177 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87932d5f-678b-4115-8fa4-37a1fa008062-util\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb\" (UID: \"87932d5f-678b-4115-8fa4-37a1fa008062\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" Sep 30 17:30:35 crc kubenswrapper[4778]: I0930 17:30:35.586721 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/87932d5f-678b-4115-8fa4-37a1fa008062-bundle\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb\" (UID: \"87932d5f-678b-4115-8fa4-37a1fa008062\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" Sep 30 17:30:35 crc kubenswrapper[4778]: I0930 17:30:35.587066 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87932d5f-678b-4115-8fa4-37a1fa008062-util\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb\" (UID: \"87932d5f-678b-4115-8fa4-37a1fa008062\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" Sep 30 17:30:35 crc kubenswrapper[4778]: I0930 17:30:35.610986 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65b5s\" (UniqueName: \"kubernetes.io/projected/87932d5f-678b-4115-8fa4-37a1fa008062-kube-api-access-65b5s\") pod \"53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb\" (UID: \"87932d5f-678b-4115-8fa4-37a1fa008062\") " pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" Sep 30 17:30:35 crc kubenswrapper[4778]: I0930 17:30:35.690955 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" Sep 30 17:30:36 crc kubenswrapper[4778]: I0930 17:30:36.167713 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb"] Sep 30 17:30:36 crc kubenswrapper[4778]: I0930 17:30:36.205299 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" event={"ID":"87932d5f-678b-4115-8fa4-37a1fa008062","Type":"ContainerStarted","Data":"c35a390cd7d240d69f0393dc50446f6eafded371cda8b8f1ef7ac4bc4c8c8c96"} Sep 30 17:30:37 crc kubenswrapper[4778]: I0930 17:30:37.216542 4778 generic.go:334] "Generic (PLEG): container finished" podID="87932d5f-678b-4115-8fa4-37a1fa008062" containerID="49b69ac34a633c613178d8211daa53b0b2d493381b64fd439a5be6eb59cb8207" exitCode=0 Sep 30 17:30:37 crc kubenswrapper[4778]: I0930 17:30:37.216723 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" event={"ID":"87932d5f-678b-4115-8fa4-37a1fa008062","Type":"ContainerDied","Data":"49b69ac34a633c613178d8211daa53b0b2d493381b64fd439a5be6eb59cb8207"} Sep 30 17:30:38 crc kubenswrapper[4778]: I0930 17:30:38.231148 4778 generic.go:334] "Generic (PLEG): container finished" podID="87932d5f-678b-4115-8fa4-37a1fa008062" containerID="cb749fd3f97de52d6569ef7fe72e20b4f72ae1b7aac6d01ab5e874b8432f6744" exitCode=0 Sep 30 17:30:38 crc kubenswrapper[4778]: I0930 17:30:38.231240 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" event={"ID":"87932d5f-678b-4115-8fa4-37a1fa008062","Type":"ContainerDied","Data":"cb749fd3f97de52d6569ef7fe72e20b4f72ae1b7aac6d01ab5e874b8432f6744"} Sep 30 17:30:39 crc kubenswrapper[4778]: I0930 17:30:39.243251 4778 generic.go:334] "Generic (PLEG): container finished" podID="87932d5f-678b-4115-8fa4-37a1fa008062" containerID="8e50651a508f06cd4a94fa19e5e0296db4893c67e7ba5cc39600319f7a98f19f" exitCode=0 Sep 30 17:30:39 crc kubenswrapper[4778]: I0930 17:30:39.243383 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" event={"ID":"87932d5f-678b-4115-8fa4-37a1fa008062","Type":"ContainerDied","Data":"8e50651a508f06cd4a94fa19e5e0296db4893c67e7ba5cc39600319f7a98f19f"} Sep 30 17:30:40 crc kubenswrapper[4778]: I0930 17:30:40.654048 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" Sep 30 17:30:40 crc kubenswrapper[4778]: I0930 17:30:40.779728 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-65b5s\" (UniqueName: \"kubernetes.io/projected/87932d5f-678b-4115-8fa4-37a1fa008062-kube-api-access-65b5s\") pod \"87932d5f-678b-4115-8fa4-37a1fa008062\" (UID: \"87932d5f-678b-4115-8fa4-37a1fa008062\") " Sep 30 17:30:40 crc kubenswrapper[4778]: I0930 17:30:40.779959 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87932d5f-678b-4115-8fa4-37a1fa008062-util\") pod \"87932d5f-678b-4115-8fa4-37a1fa008062\" (UID: \"87932d5f-678b-4115-8fa4-37a1fa008062\") " Sep 30 17:30:40 crc kubenswrapper[4778]: I0930 17:30:40.780188 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87932d5f-678b-4115-8fa4-37a1fa008062-bundle\") pod \"87932d5f-678b-4115-8fa4-37a1fa008062\" (UID: \"87932d5f-678b-4115-8fa4-37a1fa008062\") " Sep 30 17:30:40 crc kubenswrapper[4778]: I0930 17:30:40.783030 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87932d5f-678b-4115-8fa4-37a1fa008062-bundle" (OuterVolumeSpecName: "bundle") pod "87932d5f-678b-4115-8fa4-37a1fa008062" (UID: "87932d5f-678b-4115-8fa4-37a1fa008062"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:30:40 crc kubenswrapper[4778]: I0930 17:30:40.790054 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87932d5f-678b-4115-8fa4-37a1fa008062-kube-api-access-65b5s" (OuterVolumeSpecName: "kube-api-access-65b5s") pod "87932d5f-678b-4115-8fa4-37a1fa008062" (UID: "87932d5f-678b-4115-8fa4-37a1fa008062"). InnerVolumeSpecName "kube-api-access-65b5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:30:40 crc kubenswrapper[4778]: I0930 17:30:40.819119 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87932d5f-678b-4115-8fa4-37a1fa008062-util" (OuterVolumeSpecName: "util") pod "87932d5f-678b-4115-8fa4-37a1fa008062" (UID: "87932d5f-678b-4115-8fa4-37a1fa008062"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:30:40 crc kubenswrapper[4778]: I0930 17:30:40.884136 4778 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/87932d5f-678b-4115-8fa4-37a1fa008062-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:40 crc kubenswrapper[4778]: I0930 17:30:40.884196 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-65b5s\" (UniqueName: \"kubernetes.io/projected/87932d5f-678b-4115-8fa4-37a1fa008062-kube-api-access-65b5s\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:40 crc kubenswrapper[4778]: I0930 17:30:40.884215 4778 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/87932d5f-678b-4115-8fa4-37a1fa008062-util\") on node \"crc\" DevicePath \"\"" Sep 30 17:30:41 crc kubenswrapper[4778]: I0930 17:30:41.262723 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" event={"ID":"87932d5f-678b-4115-8fa4-37a1fa008062","Type":"ContainerDied","Data":"c35a390cd7d240d69f0393dc50446f6eafded371cda8b8f1ef7ac4bc4c8c8c96"} Sep 30 17:30:41 crc kubenswrapper[4778]: I0930 17:30:41.262772 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb" Sep 30 17:30:41 crc kubenswrapper[4778]: I0930 17:30:41.262804 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c35a390cd7d240d69f0393dc50446f6eafded371cda8b8f1ef7ac4bc4c8c8c96" Sep 30 17:30:44 crc kubenswrapper[4778]: I0930 17:30:44.812231 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:30:44 crc kubenswrapper[4778]: I0930 17:30:44.813350 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:30:47 crc kubenswrapper[4778]: I0930 17:30:47.838518 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d8fdfd448-zzqnn"] Sep 30 17:30:47 crc kubenswrapper[4778]: E0930 17:30:47.838892 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87932d5f-678b-4115-8fa4-37a1fa008062" containerName="util" Sep 30 17:30:47 crc kubenswrapper[4778]: I0930 17:30:47.838908 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="87932d5f-678b-4115-8fa4-37a1fa008062" containerName="util" Sep 30 17:30:47 crc kubenswrapper[4778]: E0930 17:30:47.838917 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87932d5f-678b-4115-8fa4-37a1fa008062" containerName="extract" Sep 30 17:30:47 crc kubenswrapper[4778]: I0930 17:30:47.838924 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="87932d5f-678b-4115-8fa4-37a1fa008062" containerName="extract" Sep 30 17:30:47 crc kubenswrapper[4778]: E0930 17:30:47.838946 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87932d5f-678b-4115-8fa4-37a1fa008062" containerName="pull" Sep 30 
17:30:47 crc kubenswrapper[4778]: I0930 17:30:47.838955 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="87932d5f-678b-4115-8fa4-37a1fa008062" containerName="pull" Sep 30 17:30:47 crc kubenswrapper[4778]: I0930 17:30:47.839110 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="87932d5f-678b-4115-8fa4-37a1fa008062" containerName="extract" Sep 30 17:30:47 crc kubenswrapper[4778]: I0930 17:30:47.839966 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-zzqnn" Sep 30 17:30:47 crc kubenswrapper[4778]: I0930 17:30:47.846415 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-prb8q" Sep 30 17:30:47 crc kubenswrapper[4778]: I0930 17:30:47.888396 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d8fdfd448-zzqnn"] Sep 30 17:30:47 crc kubenswrapper[4778]: I0930 17:30:47.896235 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9hwm\" (UniqueName: \"kubernetes.io/projected/ded817e4-c278-4c72-8a31-2826f9a59292-kube-api-access-x9hwm\") pod \"openstack-operator-controller-operator-d8fdfd448-zzqnn\" (UID: \"ded817e4-c278-4c72-8a31-2826f9a59292\") " pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-zzqnn" Sep 30 17:30:47 crc kubenswrapper[4778]: I0930 17:30:47.997829 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9hwm\" (UniqueName: \"kubernetes.io/projected/ded817e4-c278-4c72-8a31-2826f9a59292-kube-api-access-x9hwm\") pod \"openstack-operator-controller-operator-d8fdfd448-zzqnn\" (UID: \"ded817e4-c278-4c72-8a31-2826f9a59292\") " pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-zzqnn" Sep 30 17:30:48 crc kubenswrapper[4778]: I0930 17:30:48.023180 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9hwm\" (UniqueName: \"kubernetes.io/projected/ded817e4-c278-4c72-8a31-2826f9a59292-kube-api-access-x9hwm\") pod \"openstack-operator-controller-operator-d8fdfd448-zzqnn\" (UID: \"ded817e4-c278-4c72-8a31-2826f9a59292\") " pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-zzqnn" Sep 30 17:30:48 crc kubenswrapper[4778]: I0930 17:30:48.160206 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-zzqnn" Sep 30 17:30:48 crc kubenswrapper[4778]: I0930 17:30:48.693690 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-d8fdfd448-zzqnn"] Sep 30 17:30:49 crc kubenswrapper[4778]: I0930 17:30:49.347856 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-zzqnn" event={"ID":"ded817e4-c278-4c72-8a31-2826f9a59292","Type":"ContainerStarted","Data":"3c477e4cab282efe9b7bfe421d9e41909fcfd73bc3df2ea5d7b469f71be21195"} Sep 30 17:30:53 crc kubenswrapper[4778]: I0930 17:30:53.386905 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-zzqnn" event={"ID":"ded817e4-c278-4c72-8a31-2826f9a59292","Type":"ContainerStarted","Data":"ef6b3e7a3a0628478dd513c3d187fa17c833a813b6ad220803b7d260df386166"} Sep 30 17:30:56 crc kubenswrapper[4778]: I0930 17:30:56.417306 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-zzqnn" event={"ID":"ded817e4-c278-4c72-8a31-2826f9a59292","Type":"ContainerStarted","Data":"e114159043b4b5027d34393e1299d0f01fc4f0d3088955e2d65e2d809a18b745"} Sep 30 17:30:56 crc kubenswrapper[4778]: I0930 17:30:56.417778 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-zzqnn" Sep 30 17:30:56 crc kubenswrapper[4778]: I0930 17:30:56.456108 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-zzqnn" podStartSLOduration=2.92550459 podStartE2EDuration="9.456068587s" podCreationTimestamp="2025-09-30 17:30:47 +0000 UTC" firstStartedPulling="2025-09-30 17:30:48.703194626 +0000 UTC m=+787.693092429" lastFinishedPulling="2025-09-30 17:30:55.233758623 +0000 UTC m=+794.223656426" observedRunningTime="2025-09-30 17:30:56.449267545 +0000 UTC m=+795.439165388" watchObservedRunningTime="2025-09-30 17:30:56.456068587 +0000 UTC m=+795.445966440" Sep 30 17:30:57 crc kubenswrapper[4778]: I0930 17:30:57.426657 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-d8fdfd448-zzqnn" Sep 30 17:31:06 crc kubenswrapper[4778]: I0930 17:31:06.581315 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bcxpz"] Sep 30 17:31:06 crc kubenswrapper[4778]: I0930 17:31:06.584207 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcxpz" Sep 30 17:31:06 crc kubenswrapper[4778]: I0930 17:31:06.605158 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcxpz"] Sep 30 17:31:06 crc kubenswrapper[4778]: I0930 17:31:06.699736 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l98n\" (UniqueName: \"kubernetes.io/projected/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-kube-api-access-2l98n\") pod \"redhat-marketplace-bcxpz\" (UID: \"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b\") " pod="openshift-marketplace/redhat-marketplace-bcxpz" Sep 30 17:31:06 crc kubenswrapper[4778]: I0930 17:31:06.700005 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-utilities\") pod \"redhat-marketplace-bcxpz\" (UID: \"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b\") " pod="openshift-marketplace/redhat-marketplace-bcxpz" Sep 30 17:31:06 crc kubenswrapper[4778]: I0930 17:31:06.700089 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-catalog-content\") pod \"redhat-marketplace-bcxpz\" (UID: \"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b\") " pod="openshift-marketplace/redhat-marketplace-bcxpz" Sep 30 17:31:06 crc kubenswrapper[4778]: I0930 17:31:06.801729 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l98n\" (UniqueName: \"kubernetes.io/projected/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-kube-api-access-2l98n\") pod \"redhat-marketplace-bcxpz\" (UID: \"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b\") " pod="openshift-marketplace/redhat-marketplace-bcxpz" Sep 30 17:31:06 crc kubenswrapper[4778]: I0930 17:31:06.802109 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-utilities\") pod \"redhat-marketplace-bcxpz\" (UID: \"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b\") " pod="openshift-marketplace/redhat-marketplace-bcxpz" Sep 30 17:31:06 crc kubenswrapper[4778]: I0930 17:31:06.802223 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-catalog-content\") pod \"redhat-marketplace-bcxpz\" (UID: \"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b\") " pod="openshift-marketplace/redhat-marketplace-bcxpz" Sep 30 17:31:06 crc kubenswrapper[4778]: I0930 17:31:06.802828 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-catalog-content\") pod \"redhat-marketplace-bcxpz\" (UID: \"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b\") " pod="openshift-marketplace/redhat-marketplace-bcxpz" Sep 30 17:31:06 crc kubenswrapper[4778]: I0930 17:31:06.803089 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-utilities\") pod \"redhat-marketplace-bcxpz\" (UID: \"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b\") " pod="openshift-marketplace/redhat-marketplace-bcxpz" Sep 30 17:31:06 crc kubenswrapper[4778]: I0930 17:31:06.844780 4778 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-2l98n\" (UniqueName: \"kubernetes.io/projected/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-kube-api-access-2l98n\") pod \"redhat-marketplace-bcxpz\" (UID: \"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b\") " pod="openshift-marketplace/redhat-marketplace-bcxpz" Sep 30 17:31:06 crc kubenswrapper[4778]: I0930 17:31:06.911392 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcxpz" Sep 30 17:31:07 crc kubenswrapper[4778]: I0930 17:31:07.422944 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcxpz"] Sep 30 17:31:07 crc kubenswrapper[4778]: I0930 17:31:07.488525 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcxpz" event={"ID":"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b","Type":"ContainerStarted","Data":"9c3949234816e86bdebcfe0f463a5ce9beb697dc9810dedd845013d1192cbb8a"} Sep 30 17:31:08 crc kubenswrapper[4778]: I0930 17:31:08.517183 4778 generic.go:334] "Generic (PLEG): container finished" podID="34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b" containerID="72f2f957e64e5a13361ad669f83ed6b96713d574c01475be5d0eef065608c357" exitCode=0 Sep 30 17:31:08 crc kubenswrapper[4778]: I0930 17:31:08.517249 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcxpz" event={"ID":"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b","Type":"ContainerDied","Data":"72f2f957e64e5a13361ad669f83ed6b96713d574c01475be5d0eef065608c357"} Sep 30 17:31:10 crc kubenswrapper[4778]: I0930 17:31:10.533543 4778 generic.go:334] "Generic (PLEG): container finished" podID="34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b" containerID="cadd4f81304b7077cefbdb6c4d4390f69ac9d1a04d9023cc8daa1e6778b5dbd3" exitCode=0 Sep 30 17:31:10 crc kubenswrapper[4778]: I0930 17:31:10.533663 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcxpz" event={"ID":"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b","Type":"ContainerDied","Data":"cadd4f81304b7077cefbdb6c4d4390f69ac9d1a04d9023cc8daa1e6778b5dbd3"} Sep 30 17:31:10 crc kubenswrapper[4778]: I0930 17:31:10.549909 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2s94l"] Sep 30 17:31:10 crc kubenswrapper[4778]: I0930 17:31:10.558525 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2s94l" Sep 30 17:31:10 crc kubenswrapper[4778]: I0930 17:31:10.573047 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2s94l"] Sep 30 17:31:10 crc kubenswrapper[4778]: I0930 17:31:10.658313 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63657eef-94dd-4312-9807-6214904375ed-utilities\") pod \"redhat-operators-2s94l\" (UID: \"63657eef-94dd-4312-9807-6214904375ed\") " pod="openshift-marketplace/redhat-operators-2s94l" Sep 30 17:31:10 crc kubenswrapper[4778]: I0930 17:31:10.658391 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghsjc\" (UniqueName: \"kubernetes.io/projected/63657eef-94dd-4312-9807-6214904375ed-kube-api-access-ghsjc\") pod \"redhat-operators-2s94l\" (UID: \"63657eef-94dd-4312-9807-6214904375ed\") " pod="openshift-marketplace/redhat-operators-2s94l" Sep 30 17:31:10 crc kubenswrapper[4778]: I0930 17:31:10.658422 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63657eef-94dd-4312-9807-6214904375ed-catalog-content\") pod \"redhat-operators-2s94l\" (UID: \"63657eef-94dd-4312-9807-6214904375ed\") " pod="openshift-marketplace/redhat-operators-2s94l" Sep 30 17:31:10 crc kubenswrapper[4778]: I0930 17:31:10.760004 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63657eef-94dd-4312-9807-6214904375ed-utilities\") pod \"redhat-operators-2s94l\" (UID: \"63657eef-94dd-4312-9807-6214904375ed\") " pod="openshift-marketplace/redhat-operators-2s94l" Sep 30 17:31:10 crc kubenswrapper[4778]: I0930 17:31:10.760054 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63657eef-94dd-4312-9807-6214904375ed-catalog-content\") pod \"redhat-operators-2s94l\" (UID: \"63657eef-94dd-4312-9807-6214904375ed\") " pod="openshift-marketplace/redhat-operators-2s94l" Sep 30 17:31:10 crc kubenswrapper[4778]: I0930 17:31:10.760070 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghsjc\" (UniqueName: \"kubernetes.io/projected/63657eef-94dd-4312-9807-6214904375ed-kube-api-access-ghsjc\") pod \"redhat-operators-2s94l\" (UID: \"63657eef-94dd-4312-9807-6214904375ed\") " pod="openshift-marketplace/redhat-operators-2s94l" Sep 30 17:31:10 crc kubenswrapper[4778]: I0930 17:31:10.760724 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63657eef-94dd-4312-9807-6214904375ed-catalog-content\") pod \"redhat-operators-2s94l\" (UID: \"63657eef-94dd-4312-9807-6214904375ed\") " pod="openshift-marketplace/redhat-operators-2s94l" Sep 30 17:31:10 crc kubenswrapper[4778]: I0930 17:31:10.760793 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63657eef-94dd-4312-9807-6214904375ed-utilities\") pod \"redhat-operators-2s94l\" (UID: \"63657eef-94dd-4312-9807-6214904375ed\") " pod="openshift-marketplace/redhat-operators-2s94l" Sep 30 17:31:10 crc kubenswrapper[4778]: I0930 17:31:10.784957 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ghsjc\" (UniqueName: \"kubernetes.io/projected/63657eef-94dd-4312-9807-6214904375ed-kube-api-access-ghsjc\") pod \"redhat-operators-2s94l\" (UID: \"63657eef-94dd-4312-9807-6214904375ed\") " pod="openshift-marketplace/redhat-operators-2s94l" Sep 30 17:31:10 crc kubenswrapper[4778]: I0930 17:31:10.931498 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2s94l" Sep 30 17:31:11 crc kubenswrapper[4778]: I0930 17:31:11.389191 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2s94l"] Sep 30 17:31:11 crc kubenswrapper[4778]: I0930 17:31:11.540346 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcxpz" event={"ID":"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b","Type":"ContainerStarted","Data":"928b9cbeede98e8046a2ff4fce868f23a06ef221c20db11734eae80739268712"} Sep 30 17:31:11 crc kubenswrapper[4778]: I0930 17:31:11.541905 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s94l" event={"ID":"63657eef-94dd-4312-9807-6214904375ed","Type":"ContainerStarted","Data":"ff2f767c2a105e1f0f32194ad9ce7bcc86253f284b5cbd656df1c0ca4ecb0d62"} Sep 30 17:31:11 crc kubenswrapper[4778]: I0930 17:31:11.541930 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s94l" event={"ID":"63657eef-94dd-4312-9807-6214904375ed","Type":"ContainerStarted","Data":"2e372cec492d058d4710cdc44105debd389fbe9abfec7f00f82c703ebd51be32"} Sep 30 17:31:11 crc kubenswrapper[4778]: I0930 17:31:11.560381 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bcxpz" podStartSLOduration=3.072697876 podStartE2EDuration="5.560363342s" podCreationTimestamp="2025-09-30 17:31:06 +0000 UTC" firstStartedPulling="2025-09-30 17:31:08.51948304 +0000 UTC m=+807.509380883" lastFinishedPulling="2025-09-30 17:31:11.007148556 +0000 UTC m=+809.997046349" observedRunningTime="2025-09-30 17:31:11.556815716 +0000 UTC m=+810.546713529" watchObservedRunningTime="2025-09-30 17:31:11.560363342 +0000 UTC m=+810.550261145" Sep 30 17:31:12 crc kubenswrapper[4778]: I0930 17:31:12.551975 4778 generic.go:334] "Generic (PLEG): container finished" podID="63657eef-94dd-4312-9807-6214904375ed" containerID="ff2f767c2a105e1f0f32194ad9ce7bcc86253f284b5cbd656df1c0ca4ecb0d62" exitCode=0 Sep 30 17:31:12 crc kubenswrapper[4778]: I0930 17:31:12.553126 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s94l" event={"ID":"63657eef-94dd-4312-9807-6214904375ed","Type":"ContainerDied","Data":"ff2f767c2a105e1f0f32194ad9ce7bcc86253f284b5cbd656df1c0ca4ecb0d62"} Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.017916 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-8w47p"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.019164 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-8w47p" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.022194 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-g44f6" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.038815 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-8w47p"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.042802 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-49dlx"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.043987 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-49dlx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.046378 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-2862f" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.056051 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-28qfs"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.057658 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-28qfs" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.059660 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-49dlx"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.063563 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-ggcbc" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.078170 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-28qfs"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.087523 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-84vkl"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.088516 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-84vkl" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.093615 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-sf4lz" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.094332 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-rhgbx"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.095667 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-rhgbx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.102357 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-n4xx7" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.121524 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-rhgbx"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.122538 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q2mz\" (UniqueName: \"kubernetes.io/projected/08b975a6-8387-4c2b-ab76-c34f30ac2f02-kube-api-access-4q2mz\") pod \"cinder-operator-controller-manager-644bddb6d8-49dlx\" (UID: \"08b975a6-8387-4c2b-ab76-c34f30ac2f02\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-49dlx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.122697 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5qvn\" (UniqueName: \"kubernetes.io/projected/d0633e5b-e292-47b8-81e6-b752204748a9-kube-api-access-g5qvn\") pod \"barbican-operator-controller-manager-6ff8b75857-8w47p\" (UID: \"d0633e5b-e292-47b8-81e6-b752204748a9\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-8w47p" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.127727 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-jhqnp"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.128664 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jhqnp" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.130426 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-84vkl"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.135089 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-jmnmx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.139847 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-jhqnp"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.179503 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.180520 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.180683 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-jn9dq"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.181572 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-jn9dq" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.183638 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.185445 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-7sj55" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.185922 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-hdxc6" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.195174 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-w69kc"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.198425 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-w69kc" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.203030 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-nj7jt" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.215403 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.224025 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbdpl\" (UniqueName: \"kubernetes.io/projected/e84606fe-316d-493e-8159-d9707c2f2a47-kube-api-access-hbdpl\") pod \"glance-operator-controller-manager-84958c4d49-84vkl\" (UID: \"e84606fe-316d-493e-8159-d9707c2f2a47\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-84vkl" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.224110 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4q2mz\" (UniqueName: \"kubernetes.io/projected/08b975a6-8387-4c2b-ab76-c34f30ac2f02-kube-api-access-4q2mz\") pod \"cinder-operator-controller-manager-644bddb6d8-49dlx\" (UID: \"08b975a6-8387-4c2b-ab76-c34f30ac2f02\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-49dlx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.224147 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5qvn\" (UniqueName: \"kubernetes.io/projected/d0633e5b-e292-47b8-81e6-b752204748a9-kube-api-access-g5qvn\") pod \"barbican-operator-controller-manager-6ff8b75857-8w47p\" (UID: \"d0633e5b-e292-47b8-81e6-b752204748a9\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-8w47p" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.224196 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kcm9\" (UniqueName: \"kubernetes.io/projected/66ff4558-e436-4870-a8b0-61124b0322f7-kube-api-access-8kcm9\") pod \"horizon-operator-controller-manager-9f4696d94-jhqnp\" (UID: \"66ff4558-e436-4870-a8b0-61124b0322f7\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jhqnp" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.224279 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-dws8l\" (UniqueName: \"kubernetes.io/projected/62a10d57-e47f-45de-9588-a0abb103b727-kube-api-access-dws8l\") pod \"heat-operator-controller-manager-5d889d78cf-rhgbx\" (UID: \"62a10d57-e47f-45de-9588-a0abb103b727\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-rhgbx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.224304 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8p7l\" (UniqueName: \"kubernetes.io/projected/9235e0ee-34ca-4be2-af88-ef695afe5224-kube-api-access-n8p7l\") pod \"designate-operator-controller-manager-84f4f7b77b-28qfs\" (UID: \"9235e0ee-34ca-4be2-af88-ef695afe5224\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-28qfs" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.224527 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-ldb8b"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.226778 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-ldb8b" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.233513 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-c9s6w" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.246345 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-jn9dq"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.257024 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-ldb8b"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.257771 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5qvn\" (UniqueName: \"kubernetes.io/projected/d0633e5b-e292-47b8-81e6-b752204748a9-kube-api-access-g5qvn\") pod \"barbican-operator-controller-manager-6ff8b75857-8w47p\" (UID: \"d0633e5b-e292-47b8-81e6-b752204748a9\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-8w47p" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.266202 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q2mz\" (UniqueName: \"kubernetes.io/projected/08b975a6-8387-4c2b-ab76-c34f30ac2f02-kube-api-access-4q2mz\") pod \"cinder-operator-controller-manager-644bddb6d8-49dlx\" (UID: \"08b975a6-8387-4c2b-ab76-c34f30ac2f02\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-49dlx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.278674 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-w69kc"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.293656 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-n7ssv"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.295544 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-n7ssv" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.300926 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-slcjx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.329084 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-x9xzq"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.334150 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dws8l\" (UniqueName: \"kubernetes.io/projected/62a10d57-e47f-45de-9588-a0abb103b727-kube-api-access-dws8l\") pod \"heat-operator-controller-manager-5d889d78cf-rhgbx\" (UID: \"62a10d57-e47f-45de-9588-a0abb103b727\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-rhgbx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.334329 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8p7l\" (UniqueName: \"kubernetes.io/projected/9235e0ee-34ca-4be2-af88-ef695afe5224-kube-api-access-n8p7l\") pod \"designate-operator-controller-manager-84f4f7b77b-28qfs\" (UID: \"9235e0ee-34ca-4be2-af88-ef695afe5224\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-28qfs" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.334482 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nzss\" (UniqueName: \"kubernetes.io/projected/aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5-kube-api-access-5nzss\") pod \"infra-operator-controller-manager-7d857cc749-kx62l\" (UID: \"aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.334536 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhjsb\" (UniqueName: \"kubernetes.io/projected/0bbfd0e9-af02-499b-98eb-c27f5eaed971-kube-api-access-qhjsb\") pod \"ironic-operator-controller-manager-7975b88857-w69kc\" (UID: \"0bbfd0e9-af02-499b-98eb-c27f5eaed971\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-w69kc" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.334702 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5-cert\") pod \"infra-operator-controller-manager-7d857cc749-kx62l\" (UID: \"aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.334808 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbdpl\" (UniqueName: \"kubernetes.io/projected/e84606fe-316d-493e-8159-d9707c2f2a47-kube-api-access-hbdpl\") pod \"glance-operator-controller-manager-84958c4d49-84vkl\" (UID: \"e84606fe-316d-493e-8159-d9707c2f2a47\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-84vkl" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.334895 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnnfl\" (UniqueName: 
\"kubernetes.io/projected/849e7052-8c4b-469c-8dbe-3d6d3099ed7d-kube-api-access-dnnfl\") pod \"manila-operator-controller-manager-6d68dbc695-ldb8b\" (UID: \"849e7052-8c4b-469c-8dbe-3d6d3099ed7d\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-ldb8b" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.335107 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kcm9\" (UniqueName: \"kubernetes.io/projected/66ff4558-e436-4870-a8b0-61124b0322f7-kube-api-access-8kcm9\") pod \"horizon-operator-controller-manager-9f4696d94-jhqnp\" (UID: \"66ff4558-e436-4870-a8b0-61124b0322f7\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jhqnp" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.335242 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bbkc\" (UniqueName: \"kubernetes.io/projected/6ecf488a-bace-4ef9-bec7-aa29f7dd85e8-kube-api-access-9bbkc\") pod \"keystone-operator-controller-manager-5bd55b4bff-jn9dq\" (UID: \"6ecf488a-bace-4ef9-bec7-aa29f7dd85e8\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-jn9dq" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.368001 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-x9xzq" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.385606 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbdpl\" (UniqueName: \"kubernetes.io/projected/e84606fe-316d-493e-8159-d9707c2f2a47-kube-api-access-hbdpl\") pod \"glance-operator-controller-manager-84958c4d49-84vkl\" (UID: \"e84606fe-316d-493e-8159-d9707c2f2a47\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-84vkl" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.387454 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-8w47p" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.410527 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-49dlx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.411203 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-84vkl" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.411643 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-xkkbq" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.412444 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kcm9\" (UniqueName: \"kubernetes.io/projected/66ff4558-e436-4870-a8b0-61124b0322f7-kube-api-access-8kcm9\") pod \"horizon-operator-controller-manager-9f4696d94-jhqnp\" (UID: \"66ff4558-e436-4870-a8b0-61124b0322f7\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jhqnp" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.419258 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dws8l\" (UniqueName: \"kubernetes.io/projected/62a10d57-e47f-45de-9588-a0abb103b727-kube-api-access-dws8l\") pod \"heat-operator-controller-manager-5d889d78cf-rhgbx\" (UID: \"62a10d57-e47f-45de-9588-a0abb103b727\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-rhgbx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.433125 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8p7l\" (UniqueName: \"kubernetes.io/projected/9235e0ee-34ca-4be2-af88-ef695afe5224-kube-api-access-n8p7l\") pod \"designate-operator-controller-manager-84f4f7b77b-28qfs\" (UID: \"9235e0ee-34ca-4be2-af88-ef695afe5224\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-28qfs" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.435998 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-n7ssv"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.449108 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5-cert\") pod \"infra-operator-controller-manager-7d857cc749-kx62l\" (UID: \"aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.449179 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xdc5\" (UniqueName: \"kubernetes.io/projected/7ae19188-a950-4136-b02f-264588920c60-kube-api-access-6xdc5\") pod \"mariadb-operator-controller-manager-88c7-x9xzq\" (UID: \"7ae19188-a950-4136-b02f-264588920c60\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-x9xzq" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.449218 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnnfl\" (UniqueName: \"kubernetes.io/projected/849e7052-8c4b-469c-8dbe-3d6d3099ed7d-kube-api-access-dnnfl\") pod \"manila-operator-controller-manager-6d68dbc695-ldb8b\" (UID: \"849e7052-8c4b-469c-8dbe-3d6d3099ed7d\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-ldb8b" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.449244 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bbkc\" (UniqueName: \"kubernetes.io/projected/6ecf488a-bace-4ef9-bec7-aa29f7dd85e8-kube-api-access-9bbkc\") pod \"keystone-operator-controller-manager-5bd55b4bff-jn9dq\" (UID: 
\"6ecf488a-bace-4ef9-bec7-aa29f7dd85e8\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-jn9dq" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.449298 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nzss\" (UniqueName: \"kubernetes.io/projected/aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5-kube-api-access-5nzss\") pod \"infra-operator-controller-manager-7d857cc749-kx62l\" (UID: \"aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l" Sep 30 17:31:13 crc kubenswrapper[4778]: E0930 17:31:13.449300 4778 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.449326 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhjsb\" (UniqueName: \"kubernetes.io/projected/0bbfd0e9-af02-499b-98eb-c27f5eaed971-kube-api-access-qhjsb\") pod \"ironic-operator-controller-manager-7975b88857-w69kc\" (UID: \"0bbfd0e9-af02-499b-98eb-c27f5eaed971\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-w69kc" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.449351 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6phq\" (UniqueName: \"kubernetes.io/projected/3fbf26ad-2f08-4506-8d77-c0162b8792f5-kube-api-access-n6phq\") pod \"neutron-operator-controller-manager-64d7b59854-n7ssv\" (UID: \"3fbf26ad-2f08-4506-8d77-c0162b8792f5\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-n7ssv" Sep 30 17:31:13 crc kubenswrapper[4778]: E0930 17:31:13.449386 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5-cert podName:aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5 nodeName:}" failed. No retries permitted until 2025-09-30 17:31:13.949364595 +0000 UTC m=+812.939262398 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5-cert") pod "infra-operator-controller-manager-7d857cc749-kx62l" (UID: "aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5") : secret "infra-operator-webhook-server-cert" not found Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.458300 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jhqnp" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.472467 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-x9xzq"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.475868 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-9s9jn"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.477051 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-9s9jn" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.478021 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhjsb\" (UniqueName: \"kubernetes.io/projected/0bbfd0e9-af02-499b-98eb-c27f5eaed971-kube-api-access-qhjsb\") pod \"ironic-operator-controller-manager-7975b88857-w69kc\" (UID: \"0bbfd0e9-af02-499b-98eb-c27f5eaed971\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-w69kc" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.482462 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-28qzq" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.483832 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nzss\" (UniqueName: \"kubernetes.io/projected/aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5-kube-api-access-5nzss\") pod \"infra-operator-controller-manager-7d857cc749-kx62l\" (UID: \"aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.484069 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-ztxmv"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.484872 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bbkc\" (UniqueName: \"kubernetes.io/projected/6ecf488a-bace-4ef9-bec7-aa29f7dd85e8-kube-api-access-9bbkc\") pod \"keystone-operator-controller-manager-5bd55b4bff-jn9dq\" (UID: \"6ecf488a-bace-4ef9-bec7-aa29f7dd85e8\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-jn9dq" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.492259 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnnfl\" (UniqueName: \"kubernetes.io/projected/849e7052-8c4b-469c-8dbe-3d6d3099ed7d-kube-api-access-dnnfl\") pod \"manila-operator-controller-manager-6d68dbc695-ldb8b\" (UID: \"849e7052-8c4b-469c-8dbe-3d6d3099ed7d\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-ldb8b" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.493819 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-ztxmv" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.495834 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-jtnxl" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.503394 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-9s9jn"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.513189 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-ztxmv"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.522355 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-jn9dq" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.524897 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.525999 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-w69kc" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.531391 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-m4rzv"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.533028 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.533277 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.534158 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-m4rzv" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.538503 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-nsrk8" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.538633 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-vfvgx"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.538736 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.545289 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-m4rzv"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.545476 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vfvgx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.547373 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-dbs5l" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.548020 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-drd8v" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.548457 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-vtw6g"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.550067 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-vtw6g" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.551069 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xdc5\" (UniqueName: \"kubernetes.io/projected/7ae19188-a950-4136-b02f-264588920c60-kube-api-access-6xdc5\") pod \"mariadb-operator-controller-manager-88c7-x9xzq\" (UID: \"7ae19188-a950-4136-b02f-264588920c60\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-x9xzq" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.551159 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6phq\" (UniqueName: \"kubernetes.io/projected/3fbf26ad-2f08-4506-8d77-c0162b8792f5-kube-api-access-n6phq\") pod \"neutron-operator-controller-manager-64d7b59854-n7ssv\" (UID: \"3fbf26ad-2f08-4506-8d77-c0162b8792f5\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-n7ssv" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.553450 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-djhht" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.563586 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-vfvgx"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.573796 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6phq\" (UniqueName: \"kubernetes.io/projected/3fbf26ad-2f08-4506-8d77-c0162b8792f5-kube-api-access-n6phq\") pod \"neutron-operator-controller-manager-64d7b59854-n7ssv\" (UID: \"3fbf26ad-2f08-4506-8d77-c0162b8792f5\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-n7ssv" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.580300 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-vtw6g"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.581163 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xdc5\" (UniqueName: \"kubernetes.io/projected/7ae19188-a950-4136-b02f-264588920c60-kube-api-access-6xdc5\") pod \"mariadb-operator-controller-manager-88c7-x9xzq\" (UID: \"7ae19188-a950-4136-b02f-264588920c60\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-x9xzq" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.589429 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8hfcx"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.591351 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8hfcx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.598286 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-4hz5g" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.602519 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-ldb8b" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.608476 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8hfcx"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.624688 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-j4dhd"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.629976 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-j4dhd" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.637223 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-vb8w6" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.642712 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-n7ssv" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.653001 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vzfk\" (UniqueName: \"kubernetes.io/projected/77616b28-3707-4b13-a2ca-efc265a63676-kube-api-access-5vzfk\") pod \"openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx\" (UID: \"77616b28-3707-4b13-a2ca-efc265a63676\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.653049 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77616b28-3707-4b13-a2ca-efc265a63676-cert\") pod \"openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx\" (UID: \"77616b28-3707-4b13-a2ca-efc265a63676\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.653081 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7pm8\" (UniqueName: \"kubernetes.io/projected/89a96b51-b353-47e4-8242-9de93451c210-kube-api-access-p7pm8\") pod \"octavia-operator-controller-manager-76fcc6dc7c-ztxmv\" (UID: \"89a96b51-b353-47e4-8242-9de93451c210\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-ztxmv" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.653177 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckhpn\" (UniqueName: \"kubernetes.io/projected/b159e466-544e-47b1-9617-d1ffcec28b1c-kube-api-access-ckhpn\") pod \"swift-operator-controller-manager-bc7dc7bd9-vtw6g\" (UID: \"b159e466-544e-47b1-9617-d1ffcec28b1c\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-vtw6g" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.653216 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6vhz\" (UniqueName: \"kubernetes.io/projected/60314fa2-575c-42dc-beb7-d37d3ba69cac-kube-api-access-x6vhz\") pod \"placement-operator-controller-manager-589c58c6c-vfvgx\" (UID: \"60314fa2-575c-42dc-beb7-d37d3ba69cac\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vfvgx" Sep 30 17:31:13 crc 
kubenswrapper[4778]: I0930 17:31:13.653289 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g427v\" (UniqueName: \"kubernetes.io/projected/3fd2b550-65c1-4163-b735-2e517b37c34c-kube-api-access-g427v\") pod \"ovn-operator-controller-manager-9976ff44c-m4rzv\" (UID: \"3fd2b550-65c1-4163-b735-2e517b37c34c\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-m4rzv" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.653409 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2hcn\" (UniqueName: \"kubernetes.io/projected/4cdf1c9f-3579-41f3-9a10-05c1c8a5f241-kube-api-access-g2hcn\") pod \"nova-operator-controller-manager-c7c776c96-9s9jn\" (UID: \"4cdf1c9f-3579-41f3-9a10-05c1c8a5f241\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-9s9jn" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.661496 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-j4dhd"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.678041 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-28qfs" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.698919 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-z6mzw"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.709952 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-z6mzw" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.715298 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-vhfjd" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.716072 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-rhgbx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.754856 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vzfk\" (UniqueName: \"kubernetes.io/projected/77616b28-3707-4b13-a2ca-efc265a63676-kube-api-access-5vzfk\") pod \"openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx\" (UID: \"77616b28-3707-4b13-a2ca-efc265a63676\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.754916 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77616b28-3707-4b13-a2ca-efc265a63676-cert\") pod \"openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx\" (UID: \"77616b28-3707-4b13-a2ca-efc265a63676\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.754953 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7pm8\" (UniqueName: \"kubernetes.io/projected/89a96b51-b353-47e4-8242-9de93451c210-kube-api-access-p7pm8\") pod \"octavia-operator-controller-manager-76fcc6dc7c-ztxmv\" (UID: \"89a96b51-b353-47e4-8242-9de93451c210\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-ztxmv" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.755001 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvznt\" (UniqueName: \"kubernetes.io/projected/9f70afd9-8a72-4776-8586-6afee1834e3f-kube-api-access-pvznt\") pod \"telemetry-operator-controller-manager-7bdb6cfb74-8hfcx\" (UID: \"9f70afd9-8a72-4776-8586-6afee1834e3f\") " pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8hfcx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.755027 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckhpn\" (UniqueName: \"kubernetes.io/projected/b159e466-544e-47b1-9617-d1ffcec28b1c-kube-api-access-ckhpn\") pod \"swift-operator-controller-manager-bc7dc7bd9-vtw6g\" (UID: \"b159e466-544e-47b1-9617-d1ffcec28b1c\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-vtw6g" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.755052 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6vhz\" (UniqueName: \"kubernetes.io/projected/60314fa2-575c-42dc-beb7-d37d3ba69cac-kube-api-access-x6vhz\") pod \"placement-operator-controller-manager-589c58c6c-vfvgx\" (UID: \"60314fa2-575c-42dc-beb7-d37d3ba69cac\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vfvgx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.755076 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj26x\" (UniqueName: \"kubernetes.io/projected/c32b9788-c6d5-43ae-a777-ce4aeb3cabf0-kube-api-access-mj26x\") pod \"test-operator-controller-manager-f66b554c6-j4dhd\" (UID: \"c32b9788-c6d5-43ae-a777-ce4aeb3cabf0\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-j4dhd" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.755120 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g427v\" (UniqueName: 
\"kubernetes.io/projected/3fd2b550-65c1-4163-b735-2e517b37c34c-kube-api-access-g427v\") pod \"ovn-operator-controller-manager-9976ff44c-m4rzv\" (UID: \"3fd2b550-65c1-4163-b735-2e517b37c34c\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-m4rzv" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.755199 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2hcn\" (UniqueName: \"kubernetes.io/projected/4cdf1c9f-3579-41f3-9a10-05c1c8a5f241-kube-api-access-g2hcn\") pod \"nova-operator-controller-manager-c7c776c96-9s9jn\" (UID: \"4cdf1c9f-3579-41f3-9a10-05c1c8a5f241\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-9s9jn" Sep 30 17:31:13 crc kubenswrapper[4778]: E0930 17:31:13.755916 4778 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 17:31:13 crc kubenswrapper[4778]: E0930 17:31:13.755991 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/77616b28-3707-4b13-a2ca-efc265a63676-cert podName:77616b28-3707-4b13-a2ca-efc265a63676 nodeName:}" failed. No retries permitted until 2025-09-30 17:31:14.255969163 +0000 UTC m=+813.245866966 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/77616b28-3707-4b13-a2ca-efc265a63676-cert") pod "openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" (UID: "77616b28-3707-4b13-a2ca-efc265a63676") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.798883 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vzfk\" (UniqueName: \"kubernetes.io/projected/77616b28-3707-4b13-a2ca-efc265a63676-kube-api-access-5vzfk\") pod \"openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx\" (UID: \"77616b28-3707-4b13-a2ca-efc265a63676\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.799260 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7pm8\" (UniqueName: \"kubernetes.io/projected/89a96b51-b353-47e4-8242-9de93451c210-kube-api-access-p7pm8\") pod \"octavia-operator-controller-manager-76fcc6dc7c-ztxmv\" (UID: \"89a96b51-b353-47e4-8242-9de93451c210\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-ztxmv" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.799749 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6vhz\" (UniqueName: \"kubernetes.io/projected/60314fa2-575c-42dc-beb7-d37d3ba69cac-kube-api-access-x6vhz\") pod \"placement-operator-controller-manager-589c58c6c-vfvgx\" (UID: \"60314fa2-575c-42dc-beb7-d37d3ba69cac\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vfvgx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.803799 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckhpn\" (UniqueName: \"kubernetes.io/projected/b159e466-544e-47b1-9617-d1ffcec28b1c-kube-api-access-ckhpn\") pod \"swift-operator-controller-manager-bc7dc7bd9-vtw6g\" (UID: \"b159e466-544e-47b1-9617-d1ffcec28b1c\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-vtw6g" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.804482 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g427v\" (UniqueName: \"kubernetes.io/projected/3fd2b550-65c1-4163-b735-2e517b37c34c-kube-api-access-g427v\") pod \"ovn-operator-controller-manager-9976ff44c-m4rzv\" (UID: \"3fd2b550-65c1-4163-b735-2e517b37c34c\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-m4rzv" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.817907 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-x9xzq" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.820119 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2hcn\" (UniqueName: \"kubernetes.io/projected/4cdf1c9f-3579-41f3-9a10-05c1c8a5f241-kube-api-access-g2hcn\") pod \"nova-operator-controller-manager-c7c776c96-9s9jn\" (UID: \"4cdf1c9f-3579-41f3-9a10-05c1c8a5f241\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-9s9jn" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.820698 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-9s9jn" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.838610 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-ztxmv" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.858754 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvznt\" (UniqueName: \"kubernetes.io/projected/9f70afd9-8a72-4776-8586-6afee1834e3f-kube-api-access-pvznt\") pod \"telemetry-operator-controller-manager-7bdb6cfb74-8hfcx\" (UID: \"9f70afd9-8a72-4776-8586-6afee1834e3f\") " pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8hfcx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.858801 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj26x\" (UniqueName: \"kubernetes.io/projected/c32b9788-c6d5-43ae-a777-ce4aeb3cabf0-kube-api-access-mj26x\") pod \"test-operator-controller-manager-f66b554c6-j4dhd\" (UID: \"c32b9788-c6d5-43ae-a777-ce4aeb3cabf0\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-j4dhd" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.858829 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgd4j\" (UniqueName: \"kubernetes.io/projected/84123306-eb7b-45d1-b542-f1f831949fb4-kube-api-access-tgd4j\") pod \"watcher-operator-controller-manager-76669f99c-z6mzw\" (UID: \"84123306-eb7b-45d1-b542-f1f831949fb4\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-z6mzw" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.885837 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-m4rzv" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.888608 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-z6mzw"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.888678 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.889809 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.889829 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-hrfgk"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.890760 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvznt\" (UniqueName: \"kubernetes.io/projected/9f70afd9-8a72-4776-8586-6afee1834e3f-kube-api-access-pvznt\") pod \"telemetry-operator-controller-manager-7bdb6cfb74-8hfcx\" (UID: \"9f70afd9-8a72-4776-8586-6afee1834e3f\") " pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8hfcx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.890919 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-hrfgk" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.891166 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.896663 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-j4phz" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.897173 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.897305 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-dxrs7" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.897762 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj26x\" (UniqueName: \"kubernetes.io/projected/c32b9788-c6d5-43ae-a777-ce4aeb3cabf0-kube-api-access-mj26x\") pod \"test-operator-controller-manager-f66b554c6-j4dhd\" (UID: \"c32b9788-c6d5-43ae-a777-ce4aeb3cabf0\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-j4dhd" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.903245 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-hrfgk"] Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.907704 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vfvgx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.948707 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-vtw6g" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.963893 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5-cert\") pod \"infra-operator-controller-manager-7d857cc749-kx62l\" (UID: \"aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.963949 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcfb480c-3a1f-4a9d-83a0-183c46be742a-cert\") pod \"openstack-operator-controller-manager-5468b64689-5nf52\" (UID: \"bcfb480c-3a1f-4a9d-83a0-183c46be742a\") " pod="openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.964000 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk2k6\" (UniqueName: \"kubernetes.io/projected/bcfb480c-3a1f-4a9d-83a0-183c46be742a-kube-api-access-vk2k6\") pod \"openstack-operator-controller-manager-5468b64689-5nf52\" (UID: \"bcfb480c-3a1f-4a9d-83a0-183c46be742a\") " pod="openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.964052 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrj4t\" (UniqueName: \"kubernetes.io/projected/dd54c570-121e-4daa-b264-fbb40a606478-kube-api-access-qrj4t\") pod \"rabbitmq-cluster-operator-manager-79d8469568-hrfgk\" (UID: \"dd54c570-121e-4daa-b264-fbb40a606478\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-hrfgk" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.964079 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgd4j\" (UniqueName: \"kubernetes.io/projected/84123306-eb7b-45d1-b542-f1f831949fb4-kube-api-access-tgd4j\") pod \"watcher-operator-controller-manager-76669f99c-z6mzw\" (UID: \"84123306-eb7b-45d1-b542-f1f831949fb4\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-z6mzw" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.967803 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8hfcx" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.980100 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5-cert\") pod \"infra-operator-controller-manager-7d857cc749-kx62l\" (UID: \"aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l" Sep 30 17:31:13 crc kubenswrapper[4778]: I0930 17:31:13.997922 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-j4dhd" Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.000778 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgd4j\" (UniqueName: \"kubernetes.io/projected/84123306-eb7b-45d1-b542-f1f831949fb4-kube-api-access-tgd4j\") pod \"watcher-operator-controller-manager-76669f99c-z6mzw\" (UID: \"84123306-eb7b-45d1-b542-f1f831949fb4\") " pod="openstack-operators/watcher-operator-controller-manager-76669f99c-z6mzw" Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.073024 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcfb480c-3a1f-4a9d-83a0-183c46be742a-cert\") pod \"openstack-operator-controller-manager-5468b64689-5nf52\" (UID: \"bcfb480c-3a1f-4a9d-83a0-183c46be742a\") " pod="openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52" Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.073124 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk2k6\" (UniqueName: \"kubernetes.io/projected/bcfb480c-3a1f-4a9d-83a0-183c46be742a-kube-api-access-vk2k6\") pod \"openstack-operator-controller-manager-5468b64689-5nf52\" (UID: \"bcfb480c-3a1f-4a9d-83a0-183c46be742a\") " pod="openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52" Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.073182 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrj4t\" (UniqueName: \"kubernetes.io/projected/dd54c570-121e-4daa-b264-fbb40a606478-kube-api-access-qrj4t\") pod \"rabbitmq-cluster-operator-manager-79d8469568-hrfgk\" (UID: \"dd54c570-121e-4daa-b264-fbb40a606478\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-hrfgk" Sep 30 17:31:14 crc kubenswrapper[4778]: E0930 17:31:14.073381 4778 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 17:31:14 crc kubenswrapper[4778]: E0930 17:31:14.073468 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcfb480c-3a1f-4a9d-83a0-183c46be742a-cert podName:bcfb480c-3a1f-4a9d-83a0-183c46be742a nodeName:}" failed. No retries permitted until 2025-09-30 17:31:14.573446643 +0000 UTC m=+813.563344436 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bcfb480c-3a1f-4a9d-83a0-183c46be742a-cert") pod "openstack-operator-controller-manager-5468b64689-5nf52" (UID: "bcfb480c-3a1f-4a9d-83a0-183c46be742a") : secret "webhook-server-cert" not found Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.095534 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk2k6\" (UniqueName: \"kubernetes.io/projected/bcfb480c-3a1f-4a9d-83a0-183c46be742a-kube-api-access-vk2k6\") pod \"openstack-operator-controller-manager-5468b64689-5nf52\" (UID: \"bcfb480c-3a1f-4a9d-83a0-183c46be742a\") " pod="openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52" Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.096342 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrj4t\" (UniqueName: \"kubernetes.io/projected/dd54c570-121e-4daa-b264-fbb40a606478-kube-api-access-qrj4t\") pod \"rabbitmq-cluster-operator-manager-79d8469568-hrfgk\" (UID: \"dd54c570-121e-4daa-b264-fbb40a606478\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-hrfgk" Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.105938 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l" Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.156812 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-z6mzw" Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.253107 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-hrfgk" Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.263924 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-49dlx"] Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.270831 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-8w47p"] Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.276525 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77616b28-3707-4b13-a2ca-efc265a63676-cert\") pod \"openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx\" (UID: \"77616b28-3707-4b13-a2ca-efc265a63676\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.292114 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77616b28-3707-4b13-a2ca-efc265a63676-cert\") pod \"openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx\" (UID: \"77616b28-3707-4b13-a2ca-efc265a63676\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.479964 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.583807 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcfb480c-3a1f-4a9d-83a0-183c46be742a-cert\") pod \"openstack-operator-controller-manager-5468b64689-5nf52\" (UID: \"bcfb480c-3a1f-4a9d-83a0-183c46be742a\") " pod="openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52" Sep 30 17:31:14 crc kubenswrapper[4778]: E0930 17:31:14.584522 4778 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 17:31:14 crc kubenswrapper[4778]: E0930 17:31:14.584596 4778 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcfb480c-3a1f-4a9d-83a0-183c46be742a-cert podName:bcfb480c-3a1f-4a9d-83a0-183c46be742a nodeName:}" failed. No retries permitted until 2025-09-30 17:31:15.584571631 +0000 UTC m=+814.574469434 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bcfb480c-3a1f-4a9d-83a0-183c46be742a-cert") pod "openstack-operator-controller-manager-5468b64689-5nf52" (UID: "bcfb480c-3a1f-4a9d-83a0-183c46be742a") : secret "webhook-server-cert" not found Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.618801 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-49dlx" event={"ID":"08b975a6-8387-4c2b-ab76-c34f30ac2f02","Type":"ContainerStarted","Data":"145f9dd33eb6d6dfa4fa0dfc5e37bb63db84f14aef0079d4ee002dcdc206bef7"} Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.635923 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-8w47p" event={"ID":"d0633e5b-e292-47b8-81e6-b752204748a9","Type":"ContainerStarted","Data":"538cc160e9da3ca9ce336c1cb495a487ae26cc26bfdf96439104987ade15f0ff"} Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.652791 4778 generic.go:334] "Generic (PLEG): container finished" podID="63657eef-94dd-4312-9807-6214904375ed" containerID="0d3d9ac64691ff3260d9a7e46bb3a2de3bd1bba80ccdb13646888ff2adb3c6cd" exitCode=0 Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.652841 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s94l" event={"ID":"63657eef-94dd-4312-9807-6214904375ed","Type":"ContainerDied","Data":"0d3d9ac64691ff3260d9a7e46bb3a2de3bd1bba80ccdb13646888ff2adb3c6cd"} Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.812097 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.812145 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.868533 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-84vkl"] Sep 30 17:31:14 crc kubenswrapper[4778]: W0930 17:31:14.872438 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode84606fe_316d_493e_8159_d9707c2f2a47.slice/crio-6e8553e6d6c04bf492b032d1e25904915e31cb174fb3fee89749c3d0bd6d5435 WatchSource:0}: Error finding container 6e8553e6d6c04bf492b032d1e25904915e31cb174fb3fee89749c3d0bd6d5435: Status 404 returned error can't find the container with id 6e8553e6d6c04bf492b032d1e25904915e31cb174fb3fee89749c3d0bd6d5435 Sep 30 17:31:14 crc kubenswrapper[4778]: I0930 17:31:14.911911 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-jhqnp"] Sep 30 17:31:14 crc kubenswrapper[4778]: W0930 17:31:14.916865 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66ff4558_e436_4870_a8b0_61124b0322f7.slice/crio-96fa631e8a691263c0039cd034f853dc81651b2a338939d6d280bc7c6acb4828 WatchSource:0}: Error finding container 96fa631e8a691263c0039cd034f853dc81651b2a338939d6d280bc7c6acb4828: Status 404 returned error can't find the container with id 96fa631e8a691263c0039cd034f853dc81651b2a338939d6d280bc7c6acb4828 Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.258525 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-w69kc"] Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.272795 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-jn9dq"] Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.286636 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-28qfs"] Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.301964 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-ldb8b"] Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.306930 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-x9xzq"] Sep 30 17:31:15 crc kubenswrapper[4778]: W0930 17:31:15.358494 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod849e7052_8c4b_469c_8dbe_3d6d3099ed7d.slice/crio-8f32c9ef6108ab661128951d3e1115bf648990351307b7ab024eebe63c52a147 WatchSource:0}: Error finding container 8f32c9ef6108ab661128951d3e1115bf648990351307b7ab024eebe63c52a147: Status 404 returned error can't find the container with id 8f32c9ef6108ab661128951d3e1115bf648990351307b7ab024eebe63c52a147 Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.548247 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-n7ssv"] Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.560936 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-76669f99c-z6mzw"] Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.573513 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-ztxmv"] Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.575736 4778 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-vfvgx"] Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.589273 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-rhgbx"] Sep 30 17:31:15 crc kubenswrapper[4778]: W0930 17:31:15.595308 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fbf26ad_2f08_4506_8d77_c0162b8792f5.slice/crio-f03d201448520ce8b11f4d2683f66be68dce4b9b7a484e853abe9f9ce75b1830 WatchSource:0}: Error finding container f03d201448520ce8b11f4d2683f66be68dce4b9b7a484e853abe9f9ce75b1830: Status 404 returned error can't find the container with id f03d201448520ce8b11f4d2683f66be68dce4b9b7a484e853abe9f9ce75b1830 Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.596295 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-hrfgk"] Sep 30 17:31:15 crc kubenswrapper[4778]: W0930 17:31:15.597378 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc32b9788_c6d5_43ae_a777_ce4aeb3cabf0.slice/crio-b41f0dc72813330e965863521c85a663fd94848ff92bf3dd014593736a1b9686 WatchSource:0}: Error finding container b41f0dc72813330e965863521c85a663fd94848ff92bf3dd014593736a1b9686: Status 404 returned error can't find the container with id b41f0dc72813330e965863521c85a663fd94848ff92bf3dd014593736a1b9686 Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.599146 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcfb480c-3a1f-4a9d-83a0-183c46be742a-cert\") pod \"openstack-operator-controller-manager-5468b64689-5nf52\" (UID: \"bcfb480c-3a1f-4a9d-83a0-183c46be742a\") " pod="openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52" Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.601976 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-j4dhd"] Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.609306 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8hfcx"] Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.614794 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcfb480c-3a1f-4a9d-83a0-183c46be742a-cert\") pod \"openstack-operator-controller-manager-5468b64689-5nf52\" (UID: \"bcfb480c-3a1f-4a9d-83a0-183c46be742a\") " pod="openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52" Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.615710 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l"] Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.622240 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-9s9jn"] Sep 30 17:31:15 crc kubenswrapper[4778]: W0930 17:31:15.627784 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f70afd9_8a72_4776_8586_6afee1834e3f.slice/crio-1334762c464e5ccfb3ac6402540f828c31b8aa81395af2f1a240e8b274746f8b 
WatchSource:0}: Error finding container 1334762c464e5ccfb3ac6402540f828c31b8aa81395af2f1a240e8b274746f8b: Status 404 returned error can't find the container with id 1334762c464e5ccfb3ac6402540f828c31b8aa81395af2f1a240e8b274746f8b Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.653252 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-m4rzv"] Sep 30 17:31:15 crc kubenswrapper[4778]: E0930 17:31:15.653403 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g427v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-9976ff44c-m4rzv_openstack-operators(3fd2b550-65c1-4163-b735-2e517b37c34c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.663006 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-vtw6g"] Sep 30 17:31:15 crc kubenswrapper[4778]: E0930 17:31:15.663715 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:057de94f9afa340adc34f9b25f8007d9cd2ba71bc8b5d77aac522add53b7caef,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g2hcn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-c7c776c96-9s9jn_openstack-operators(4cdf1c9f-3579-41f3-9a10-05c1c8a5f241): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:31:15 crc kubenswrapper[4778]: E0930 17:31:15.663875 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-x6vhz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-589c58c6c-vfvgx_openstack-operators(60314fa2-575c-42dc-beb7-d37d3ba69cac): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:31:15 crc kubenswrapper[4778]: E0930 17:31:15.668308 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:917e6dcc519277c46e42898bc9f0f066790fa7b9633fcde668cc8a68a547c13c,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dws8l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5d889d78cf-rhgbx_openstack-operators(62a10d57-e47f-45de-9588-a0abb103b727): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:31:15 crc kubenswrapper[4778]: E0930 17:31:15.691774 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tgd4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-76669f99c-z6mzw_openstack-operators(84123306-eb7b-45d1-b542-f1f831949fb4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.694578 4778 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx"] Sep 30 17:31:15 crc kubenswrapper[4778]: E0930 17:31:15.697198 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ckhpn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-bc7dc7bd9-vtw6g_openstack-operators(b159e466-544e-47b1-9617-d1ffcec28b1c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:31:15 crc kubenswrapper[4778]: W0930 17:31:15.700374 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77616b28_3707_4b13_a2ca_efc265a63676.slice/crio-8195bc93a819d1946b8b7ed11a962ec26bc60b1290e71cb700655e4c1a236a24 WatchSource:0}: Error finding container 8195bc93a819d1946b8b7ed11a962ec26bc60b1290e71cb700655e4c1a236a24: Status 404 returned error can't find the container with id 8195bc93a819d1946b8b7ed11a962ec26bc60b1290e71cb700655e4c1a236a24 Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.701648 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-x9xzq" event={"ID":"7ae19188-a950-4136-b02f-264588920c60","Type":"ContainerStarted","Data":"b6025e0e89561e4d5f6c31ca4f115669c5c310e5d2647bd27fcb40ad740f8d3a"} Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 
17:31:15.708438 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-j4dhd" event={"ID":"c32b9788-c6d5-43ae-a777-ce4aeb3cabf0","Type":"ContainerStarted","Data":"b41f0dc72813330e965863521c85a663fd94848ff92bf3dd014593736a1b9686"} Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.711208 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-84vkl" event={"ID":"e84606fe-316d-493e-8159-d9707c2f2a47","Type":"ContainerStarted","Data":"6e8553e6d6c04bf492b032d1e25904915e31cb174fb3fee89749c3d0bd6d5435"} Sep 30 17:31:15 crc kubenswrapper[4778]: E0930 17:31:15.713055 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/opensta
ck-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROCESSOR_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_S
RIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified
,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content:os-docs-2024.2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-
podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5vzfk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx_openstack-operators(77616b28-3707-4b13-a2ca-efc265a63676): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.725144 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52" Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.727547 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-n7ssv" event={"ID":"3fbf26ad-2f08-4506-8d77-c0162b8792f5","Type":"ContainerStarted","Data":"f03d201448520ce8b11f4d2683f66be68dce4b9b7a484e853abe9f9ce75b1830"} Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.727602 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-28qfs" event={"ID":"9235e0ee-34ca-4be2-af88-ef695afe5224","Type":"ContainerStarted","Data":"3fddef58eb332f93878c949a4619bc8ab522e3523a8798cfa5313b66afe6ccf0"} Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.727632 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-m4rzv" event={"ID":"3fd2b550-65c1-4163-b735-2e517b37c34c","Type":"ContainerStarted","Data":"2358ffcee20118f1e96cd488bc76383530d3dc65a8fd1cdb3f40d03544db47d1"} Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.727644 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-ldb8b" event={"ID":"849e7052-8c4b-469c-8dbe-3d6d3099ed7d","Type":"ContainerStarted","Data":"8f32c9ef6108ab661128951d3e1115bf648990351307b7ab024eebe63c52a147"} Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.730321 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8hfcx" event={"ID":"9f70afd9-8a72-4776-8586-6afee1834e3f","Type":"ContainerStarted","Data":"1334762c464e5ccfb3ac6402540f828c31b8aa81395af2f1a240e8b274746f8b"} Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.735109 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-w69kc" event={"ID":"0bbfd0e9-af02-499b-98eb-c27f5eaed971","Type":"ContainerStarted","Data":"c0eae2d5421d3267982eefb27fff65bf9fae2e21462527b9a9cdd0e4ff63b591"} Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.739202 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s94l" event={"ID":"63657eef-94dd-4312-9807-6214904375ed","Type":"ContainerStarted","Data":"890ecf0d435909821dba004247ac176454d6c274b91954b7e2fac5a63042111e"} Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.744181 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-hrfgk" event={"ID":"dd54c570-121e-4daa-b264-fbb40a606478","Type":"ContainerStarted","Data":"7e244acf33f4be1ae6742f6b2dce93768d60286082548def10c56412919638dc"} Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.745603 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-jn9dq" event={"ID":"6ecf488a-bace-4ef9-bec7-aa29f7dd85e8","Type":"ContainerStarted","Data":"695a84a9cd1a880281c254fe997dd30d45c918c8ca5d50280e43b3dcf91ec46e"} Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.747520 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jhqnp" 
event={"ID":"66ff4558-e436-4870-a8b0-61124b0322f7","Type":"ContainerStarted","Data":"96fa631e8a691263c0039cd034f853dc81651b2a338939d6d280bc7c6acb4828"} Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.750730 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l" event={"ID":"aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5","Type":"ContainerStarted","Data":"1f5705a4813cff7e3b3bb34f7daf4c7e339e30b1343109860daddca6b05c4f89"} Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.751832 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-ztxmv" event={"ID":"89a96b51-b353-47e4-8242-9de93451c210","Type":"ContainerStarted","Data":"8d90ed04afa7003d91d32e35f739dd57779243d7b08c625921ed425ceeaa50bc"} Sep 30 17:31:15 crc kubenswrapper[4778]: I0930 17:31:15.762905 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2s94l" podStartSLOduration=2.930623841 podStartE2EDuration="5.762889527s" podCreationTimestamp="2025-09-30 17:31:10 +0000 UTC" firstStartedPulling="2025-09-30 17:31:12.557371739 +0000 UTC m=+811.547269572" lastFinishedPulling="2025-09-30 17:31:15.389637425 +0000 UTC m=+814.379535258" observedRunningTime="2025-09-30 17:31:15.759504296 +0000 UTC m=+814.749402119" watchObservedRunningTime="2025-09-30 17:31:15.762889527 +0000 UTC m=+814.752787330" Sep 30 17:31:15 crc kubenswrapper[4778]: E0930 17:31:15.878856 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-9s9jn" podUID="4cdf1c9f-3579-41f3-9a10-05c1c8a5f241" Sep 30 17:31:15 crc kubenswrapper[4778]: E0930 17:31:15.879133 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vfvgx" podUID="60314fa2-575c-42dc-beb7-d37d3ba69cac" Sep 30 17:31:16 crc kubenswrapper[4778]: E0930 17:31:16.034939 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-z6mzw" podUID="84123306-eb7b-45d1-b542-f1f831949fb4" Sep 30 17:31:16 crc kubenswrapper[4778]: E0930 17:31:16.038971 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-m4rzv" podUID="3fd2b550-65c1-4163-b735-2e517b37c34c" Sep 30 17:31:16 crc kubenswrapper[4778]: E0930 17:31:16.043481 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-rhgbx" podUID="62a10d57-e47f-45de-9588-a0abb103b727" Sep 30 17:31:16 crc kubenswrapper[4778]: E0930 17:31:16.100105 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" podUID="77616b28-3707-4b13-a2ca-efc265a63676" Sep 30 17:31:16 crc kubenswrapper[4778]: E0930 17:31:16.107179 
4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-vtw6g" podUID="b159e466-544e-47b1-9617-d1ffcec28b1c" Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.385591 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52"] Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.776406 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-vtw6g" event={"ID":"b159e466-544e-47b1-9617-d1ffcec28b1c","Type":"ContainerStarted","Data":"13386067f7edd1c763bc21765166a05d9c9a96870da01d44afa0f23c42d761de"} Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.776726 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-vtw6g" event={"ID":"b159e466-544e-47b1-9617-d1ffcec28b1c","Type":"ContainerStarted","Data":"98d5aabfc9150680a6483ff3d25ea109c24ca2d84bb8e97cecc7e5d99ae92bac"} Sep 30 17:31:16 crc kubenswrapper[4778]: E0930 17:31:16.779701 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-vtw6g" podUID="b159e466-544e-47b1-9617-d1ffcec28b1c" Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.786124 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-z6mzw" event={"ID":"84123306-eb7b-45d1-b542-f1f831949fb4","Type":"ContainerStarted","Data":"0eb81f23cacc39d621804310abb35fa18e996c37762be347fc87e13ab8d4fddf"} Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.786176 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-z6mzw" event={"ID":"84123306-eb7b-45d1-b542-f1f831949fb4","Type":"ContainerStarted","Data":"63b9c12a99d408d5d342c0e4aa3491499ff3a557a8982e483c319b7db3d9f686"} Sep 30 17:31:16 crc kubenswrapper[4778]: E0930 17:31:16.795297 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-z6mzw" podUID="84123306-eb7b-45d1-b542-f1f831949fb4" Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.796397 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-rhgbx" event={"ID":"62a10d57-e47f-45de-9588-a0abb103b727","Type":"ContainerStarted","Data":"022b53a828ab996b93f8a532c989317150ae905f3138803f8632f6e05eef71fa"} Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.796428 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-rhgbx" event={"ID":"62a10d57-e47f-45de-9588-a0abb103b727","Type":"ContainerStarted","Data":"86d8dd21a420b3f670af5cb8b584bba73c41a0d82c5d958ae04e5fc03007058e"} Sep 30 17:31:16 crc kubenswrapper[4778]: E0930 17:31:16.798448 
4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:917e6dcc519277c46e42898bc9f0f066790fa7b9633fcde668cc8a68a547c13c\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-rhgbx" podUID="62a10d57-e47f-45de-9588-a0abb103b727" Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.798921 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52" event={"ID":"bcfb480c-3a1f-4a9d-83a0-183c46be742a","Type":"ContainerStarted","Data":"8ac07237cdad64e0d9fa82155601b0b25391eb0a79c47c46c7c4ffcc29c4cdfc"} Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.798942 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52" event={"ID":"bcfb480c-3a1f-4a9d-83a0-183c46be742a","Type":"ContainerStarted","Data":"d7db520feffecf85d30394869afe4026b286d788308b8854cc90cb5f1692f317"} Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.800023 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-9s9jn" event={"ID":"4cdf1c9f-3579-41f3-9a10-05c1c8a5f241","Type":"ContainerStarted","Data":"3350567944e8a400eecfff20fe8b93aeee16662a026deb7cffb0e942d0bd5e84"} Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.800056 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-9s9jn" event={"ID":"4cdf1c9f-3579-41f3-9a10-05c1c8a5f241","Type":"ContainerStarted","Data":"8535b9f1e72f57079b43bac7cb56dc33419194ddfa86ddac0d0721560ecb88a7"} Sep 30 17:31:16 crc kubenswrapper[4778]: E0930 17:31:16.801365 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:057de94f9afa340adc34f9b25f8007d9cd2ba71bc8b5d77aac522add53b7caef\\\"\"" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-9s9jn" podUID="4cdf1c9f-3579-41f3-9a10-05c1c8a5f241" Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.805893 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vfvgx" event={"ID":"60314fa2-575c-42dc-beb7-d37d3ba69cac","Type":"ContainerStarted","Data":"06fb126fadb9782131d270093e91d8a15e2c46609c2c502268daa1cdf79133e8"} Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.805920 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vfvgx" event={"ID":"60314fa2-575c-42dc-beb7-d37d3ba69cac","Type":"ContainerStarted","Data":"faa5cc221cae0bf67516a99fcd7c349b60b341e73aa1f85433730f12ec50f18a"} Sep 30 17:31:16 crc kubenswrapper[4778]: E0930 17:31:16.827469 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vfvgx" podUID="60314fa2-575c-42dc-beb7-d37d3ba69cac" Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.827717 4778 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" event={"ID":"77616b28-3707-4b13-a2ca-efc265a63676","Type":"ContainerStarted","Data":"cf437c59be823bbc4fbde74058e19e6470fdba5e06a734b61b5de32607698994"} Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.827747 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" event={"ID":"77616b28-3707-4b13-a2ca-efc265a63676","Type":"ContainerStarted","Data":"8195bc93a819d1946b8b7ed11a962ec26bc60b1290e71cb700655e4c1a236a24"} Sep 30 17:31:16 crc kubenswrapper[4778]: E0930 17:31:16.830724 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" podUID="77616b28-3707-4b13-a2ca-efc265a63676" Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.844743 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-m4rzv" event={"ID":"3fd2b550-65c1-4163-b735-2e517b37c34c","Type":"ContainerStarted","Data":"4f64d4415f4a0fb0532317ead7381ec115e997a0d2c8be90432b5a7cb1b7aad1"} Sep 30 17:31:16 crc kubenswrapper[4778]: E0930 17:31:16.856985 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-m4rzv" podUID="3fd2b550-65c1-4163-b735-2e517b37c34c" Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.915882 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bcxpz" Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.915954 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bcxpz" Sep 30 17:31:16 crc kubenswrapper[4778]: I0930 17:31:16.985402 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bcxpz" Sep 30 17:31:17 crc kubenswrapper[4778]: I0930 17:31:17.880728 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52" event={"ID":"bcfb480c-3a1f-4a9d-83a0-183c46be742a","Type":"ContainerStarted","Data":"530f155754db70668c5b839394fbec6fd4fd5dbe951baa4424899e5d42c5dc27"} Sep 30 17:31:17 crc kubenswrapper[4778]: E0930 17:31:17.882598 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:7169dfadf5f5589f14ca52700d2eba991c2a0c7733f6a1ea795752d993d7f61b\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-z6mzw" podUID="84123306-eb7b-45d1-b542-f1f831949fb4" Sep 30 17:31:17 crc kubenswrapper[4778]: E0930 17:31:17.886432 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/heat-operator@sha256:917e6dcc519277c46e42898bc9f0f066790fa7b9633fcde668cc8a68a547c13c\\\"\"" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-rhgbx" podUID="62a10d57-e47f-45de-9588-a0abb103b727" Sep 30 17:31:17 crc kubenswrapper[4778]: E0930 17:31:17.886494 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-m4rzv" podUID="3fd2b550-65c1-4163-b735-2e517b37c34c" Sep 30 17:31:17 crc kubenswrapper[4778]: E0930 17:31:17.886537 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e3f947e9034a951620a76eaf41ceec95eefcef0eacb251b10993d6820d5e1af6\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" podUID="77616b28-3707-4b13-a2ca-efc265a63676" Sep 30 17:31:17 crc kubenswrapper[4778]: E0930 17:31:17.886559 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-vtw6g" podUID="b159e466-544e-47b1-9617-d1ffcec28b1c" Sep 30 17:31:17 crc kubenswrapper[4778]: E0930 17:31:17.886599 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vfvgx" podUID="60314fa2-575c-42dc-beb7-d37d3ba69cac" Sep 30 17:31:17 crc kubenswrapper[4778]: E0930 17:31:17.904906 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:057de94f9afa340adc34f9b25f8007d9cd2ba71bc8b5d77aac522add53b7caef\\\"\"" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-9s9jn" podUID="4cdf1c9f-3579-41f3-9a10-05c1c8a5f241" Sep 30 17:31:17 crc kubenswrapper[4778]: I0930 17:31:17.955686 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bcxpz" Sep 30 17:31:18 crc kubenswrapper[4778]: I0930 17:31:18.887979 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52" Sep 30 17:31:18 crc kubenswrapper[4778]: I0930 17:31:18.923030 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52" podStartSLOduration=5.92301482 podStartE2EDuration="5.92301482s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:31:18.920136326 +0000 UTC m=+817.910034129" 
watchObservedRunningTime="2025-09-30 17:31:18.92301482 +0000 UTC m=+817.912912623" Sep 30 17:31:19 crc kubenswrapper[4778]: I0930 17:31:19.134736 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcxpz"] Sep 30 17:31:19 crc kubenswrapper[4778]: I0930 17:31:19.894638 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bcxpz" podUID="34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b" containerName="registry-server" containerID="cri-o://928b9cbeede98e8046a2ff4fce868f23a06ef221c20db11734eae80739268712" gracePeriod=2 Sep 30 17:31:20 crc kubenswrapper[4778]: I0930 17:31:20.932405 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2s94l" Sep 30 17:31:20 crc kubenswrapper[4778]: I0930 17:31:20.932706 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2s94l" Sep 30 17:31:21 crc kubenswrapper[4778]: I0930 17:31:21.005883 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2s94l" Sep 30 17:31:21 crc kubenswrapper[4778]: I0930 17:31:21.928227 4778 generic.go:334] "Generic (PLEG): container finished" podID="34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b" containerID="928b9cbeede98e8046a2ff4fce868f23a06ef221c20db11734eae80739268712" exitCode=0 Sep 30 17:31:21 crc kubenswrapper[4778]: I0930 17:31:21.928266 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcxpz" event={"ID":"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b","Type":"ContainerDied","Data":"928b9cbeede98e8046a2ff4fce868f23a06ef221c20db11734eae80739268712"} Sep 30 17:31:21 crc kubenswrapper[4778]: I0930 17:31:21.985812 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2s94l" Sep 30 17:31:22 crc kubenswrapper[4778]: I0930 17:31:22.537186 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2s94l"] Sep 30 17:31:23 crc kubenswrapper[4778]: I0930 17:31:23.946263 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2s94l" podUID="63657eef-94dd-4312-9807-6214904375ed" containerName="registry-server" containerID="cri-o://890ecf0d435909821dba004247ac176454d6c274b91954b7e2fac5a63042111e" gracePeriod=2 Sep 30 17:31:24 crc kubenswrapper[4778]: I0930 17:31:24.955265 4778 generic.go:334] "Generic (PLEG): container finished" podID="63657eef-94dd-4312-9807-6214904375ed" containerID="890ecf0d435909821dba004247ac176454d6c274b91954b7e2fac5a63042111e" exitCode=0 Sep 30 17:31:24 crc kubenswrapper[4778]: I0930 17:31:24.955302 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s94l" event={"ID":"63657eef-94dd-4312-9807-6214904375ed","Type":"ContainerDied","Data":"890ecf0d435909821dba004247ac176454d6c274b91954b7e2fac5a63042111e"} Sep 30 17:31:25 crc kubenswrapper[4778]: I0930 17:31:25.731802 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5468b64689-5nf52" Sep 30 17:31:26 crc kubenswrapper[4778]: E0930 17:31:26.913946 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
928b9cbeede98e8046a2ff4fce868f23a06ef221c20db11734eae80739268712 is running failed: container process not found" containerID="928b9cbeede98e8046a2ff4fce868f23a06ef221c20db11734eae80739268712" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:31:26 crc kubenswrapper[4778]: E0930 17:31:26.914926 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 928b9cbeede98e8046a2ff4fce868f23a06ef221c20db11734eae80739268712 is running failed: container process not found" containerID="928b9cbeede98e8046a2ff4fce868f23a06ef221c20db11734eae80739268712" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:31:26 crc kubenswrapper[4778]: E0930 17:31:26.915372 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 928b9cbeede98e8046a2ff4fce868f23a06ef221c20db11734eae80739268712 is running failed: container process not found" containerID="928b9cbeede98e8046a2ff4fce868f23a06ef221c20db11734eae80739268712" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:31:26 crc kubenswrapper[4778]: E0930 17:31:26.915409 4778 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 928b9cbeede98e8046a2ff4fce868f23a06ef221c20db11734eae80739268712 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-bcxpz" podUID="34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b" containerName="registry-server" Sep 30 17:31:30 crc kubenswrapper[4778]: E0930 17:31:30.933535 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 890ecf0d435909821dba004247ac176454d6c274b91954b7e2fac5a63042111e is running failed: container process not found" containerID="890ecf0d435909821dba004247ac176454d6c274b91954b7e2fac5a63042111e" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:31:30 crc kubenswrapper[4778]: E0930 17:31:30.934379 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 890ecf0d435909821dba004247ac176454d6c274b91954b7e2fac5a63042111e is running failed: container process not found" containerID="890ecf0d435909821dba004247ac176454d6c274b91954b7e2fac5a63042111e" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:31:30 crc kubenswrapper[4778]: E0930 17:31:30.934873 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 890ecf0d435909821dba004247ac176454d6c274b91954b7e2fac5a63042111e is running failed: container process not found" containerID="890ecf0d435909821dba004247ac176454d6c274b91954b7e2fac5a63042111e" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 17:31:30 crc kubenswrapper[4778]: E0930 17:31:30.934934 4778 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 890ecf0d435909821dba004247ac176454d6c274b91954b7e2fac5a63042111e is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-2s94l" podUID="63657eef-94dd-4312-9807-6214904375ed" containerName="registry-server" Sep 30 17:31:32 crc kubenswrapper[4778]: E0930 17:31:32.006715 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc 
= copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1e2c65f4331a2bb568d97fbcd02e3bca2627e133a794e1e4fd13368e86ce6bd1" Sep 30 17:31:32 crc kubenswrapper[4778]: E0930 17:31:32.007000 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1e2c65f4331a2bb568d97fbcd02e3bca2627e133a794e1e4fd13368e86ce6bd1,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4q2mz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-644bddb6d8-49dlx_openstack-operators(08b975a6-8387-4c2b-ab76-c34f30ac2f02): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:31:32 crc kubenswrapper[4778]: E0930 17:31:32.605696 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f" Sep 30 17:31:32 crc kubenswrapper[4778]: E0930 17:31:32.606348 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5nzss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-7d857cc749-kx62l_openstack-operators(aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:31:34 crc kubenswrapper[4778]: E0930 17:31:34.131405 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b" Sep 30 17:31:34 crc kubenswrapper[4778]: E0930 17:31:34.131576 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qrj4t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-79d8469568-hrfgk_openstack-operators(dd54c570-121e-4daa-b264-fbb40a606478): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:31:34 crc kubenswrapper[4778]: E0930 17:31:34.133448 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-hrfgk" podUID="dd54c570-121e-4daa-b264-fbb40a606478" Sep 30 17:31:34 crc kubenswrapper[4778]: E0930 17:31:34.245493 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.51:5001/openstack-k8s-operators/telemetry-operator:ac359d938872c47e1f3d7d8466b12f9d1f8a5236" Sep 30 17:31:34 crc kubenswrapper[4778]: E0930 17:31:34.245881 4778 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.51:5001/openstack-k8s-operators/telemetry-operator:ac359d938872c47e1f3d7d8466b12f9d1f8a5236" Sep 30 17:31:34 crc kubenswrapper[4778]: E0930 17:31:34.245993 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.51:5001/openstack-k8s-operators/telemetry-operator:ac359d938872c47e1f3d7d8466b12f9d1f8a5236,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pvznt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7bdb6cfb74-8hfcx_openstack-operators(9f70afd9-8a72-4776-8586-6afee1834e3f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:31:35 crc kubenswrapper[4778]: E0930 17:31:35.027925 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-hrfgk" podUID="dd54c570-121e-4daa-b264-fbb40a606478" Sep 30 17:31:35 crc kubenswrapper[4778]: I0930 17:31:35.775748 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2s94l" Sep 30 17:31:35 crc kubenswrapper[4778]: I0930 17:31:35.780003 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcxpz" Sep 30 17:31:35 crc kubenswrapper[4778]: I0930 17:31:35.932838 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghsjc\" (UniqueName: \"kubernetes.io/projected/63657eef-94dd-4312-9807-6214904375ed-kube-api-access-ghsjc\") pod \"63657eef-94dd-4312-9807-6214904375ed\" (UID: \"63657eef-94dd-4312-9807-6214904375ed\") " Sep 30 17:31:35 crc kubenswrapper[4778]: I0930 17:31:35.932927 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63657eef-94dd-4312-9807-6214904375ed-catalog-content\") pod \"63657eef-94dd-4312-9807-6214904375ed\" (UID: \"63657eef-94dd-4312-9807-6214904375ed\") " Sep 30 17:31:35 crc kubenswrapper[4778]: I0930 17:31:35.932977 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-utilities\") pod \"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b\" (UID: \"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b\") " Sep 30 17:31:35 crc kubenswrapper[4778]: I0930 17:31:35.933016 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63657eef-94dd-4312-9807-6214904375ed-utilities\") pod \"63657eef-94dd-4312-9807-6214904375ed\" (UID: \"63657eef-94dd-4312-9807-6214904375ed\") " Sep 30 17:31:35 crc kubenswrapper[4778]: I0930 17:31:35.933038 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-catalog-content\") pod \"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b\" (UID: \"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b\") " Sep 30 17:31:35 crc kubenswrapper[4778]: I0930 17:31:35.933077 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2l98n\" (UniqueName: \"kubernetes.io/projected/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-kube-api-access-2l98n\") pod \"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b\" (UID: \"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b\") " Sep 30 17:31:35 crc kubenswrapper[4778]: I0930 17:31:35.934374 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-utilities" (OuterVolumeSpecName: "utilities") pod "34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b" (UID: "34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:31:35 crc kubenswrapper[4778]: I0930 17:31:35.935161 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63657eef-94dd-4312-9807-6214904375ed-utilities" (OuterVolumeSpecName: "utilities") pod "63657eef-94dd-4312-9807-6214904375ed" (UID: "63657eef-94dd-4312-9807-6214904375ed"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:31:35 crc kubenswrapper[4778]: I0930 17:31:35.937949 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-kube-api-access-2l98n" (OuterVolumeSpecName: "kube-api-access-2l98n") pod "34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b" (UID: "34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b"). InnerVolumeSpecName "kube-api-access-2l98n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:31:35 crc kubenswrapper[4778]: I0930 17:31:35.938101 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63657eef-94dd-4312-9807-6214904375ed-kube-api-access-ghsjc" (OuterVolumeSpecName: "kube-api-access-ghsjc") pod "63657eef-94dd-4312-9807-6214904375ed" (UID: "63657eef-94dd-4312-9807-6214904375ed"). InnerVolumeSpecName "kube-api-access-ghsjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:31:35 crc kubenswrapper[4778]: I0930 17:31:35.946689 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b" (UID: "34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:31:36 crc kubenswrapper[4778]: I0930 17:31:36.010200 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63657eef-94dd-4312-9807-6214904375ed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63657eef-94dd-4312-9807-6214904375ed" (UID: "63657eef-94dd-4312-9807-6214904375ed"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:31:36 crc kubenswrapper[4778]: I0930 17:31:36.034164 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghsjc\" (UniqueName: \"kubernetes.io/projected/63657eef-94dd-4312-9807-6214904375ed-kube-api-access-ghsjc\") on node \"crc\" DevicePath \"\"" Sep 30 17:31:36 crc kubenswrapper[4778]: I0930 17:31:36.034199 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63657eef-94dd-4312-9807-6214904375ed-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:31:36 crc kubenswrapper[4778]: I0930 17:31:36.034233 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:31:36 crc kubenswrapper[4778]: I0930 17:31:36.034249 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63657eef-94dd-4312-9807-6214904375ed-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:31:36 crc kubenswrapper[4778]: I0930 17:31:36.034261 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:31:36 crc kubenswrapper[4778]: I0930 17:31:36.034272 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2l98n\" (UniqueName: \"kubernetes.io/projected/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b-kube-api-access-2l98n\") on node \"crc\" DevicePath \"\"" Sep 30 17:31:36 crc kubenswrapper[4778]: I0930 17:31:36.036845 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2s94l" event={"ID":"63657eef-94dd-4312-9807-6214904375ed","Type":"ContainerDied","Data":"2e372cec492d058d4710cdc44105debd389fbe9abfec7f00f82c703ebd51be32"} Sep 30 17:31:36 crc kubenswrapper[4778]: I0930 17:31:36.036926 4778 scope.go:117] "RemoveContainer" containerID="890ecf0d435909821dba004247ac176454d6c274b91954b7e2fac5a63042111e" Sep 30 17:31:36 crc 
kubenswrapper[4778]: I0930 17:31:36.036856 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2s94l" Sep 30 17:31:36 crc kubenswrapper[4778]: I0930 17:31:36.040078 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bcxpz" event={"ID":"34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b","Type":"ContainerDied","Data":"9c3949234816e86bdebcfe0f463a5ce9beb697dc9810dedd845013d1192cbb8a"} Sep 30 17:31:36 crc kubenswrapper[4778]: I0930 17:31:36.040160 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bcxpz" Sep 30 17:31:36 crc kubenswrapper[4778]: I0930 17:31:36.082232 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcxpz"] Sep 30 17:31:36 crc kubenswrapper[4778]: I0930 17:31:36.091496 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bcxpz"] Sep 30 17:31:36 crc kubenswrapper[4778]: I0930 17:31:36.095759 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2s94l"] Sep 30 17:31:36 crc kubenswrapper[4778]: I0930 17:31:36.099437 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2s94l"] Sep 30 17:31:37 crc kubenswrapper[4778]: I0930 17:31:37.545576 4778 scope.go:117] "RemoveContainer" containerID="0d3d9ac64691ff3260d9a7e46bb3a2de3bd1bba80ccdb13646888ff2adb3c6cd" Sep 30 17:31:37 crc kubenswrapper[4778]: I0930 17:31:37.725097 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b" path="/var/lib/kubelet/pods/34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b/volumes" Sep 30 17:31:37 crc kubenswrapper[4778]: I0930 17:31:37.726663 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63657eef-94dd-4312-9807-6214904375ed" path="/var/lib/kubelet/pods/63657eef-94dd-4312-9807-6214904375ed/volumes" Sep 30 17:31:37 crc kubenswrapper[4778]: I0930 17:31:37.745194 4778 scope.go:117] "RemoveContainer" containerID="ff2f767c2a105e1f0f32194ad9ce7bcc86253f284b5cbd656df1c0ca4ecb0d62" Sep 30 17:31:37 crc kubenswrapper[4778]: I0930 17:31:37.844026 4778 scope.go:117] "RemoveContainer" containerID="928b9cbeede98e8046a2ff4fce868f23a06ef221c20db11734eae80739268712" Sep 30 17:31:37 crc kubenswrapper[4778]: I0930 17:31:37.956788 4778 scope.go:117] "RemoveContainer" containerID="cadd4f81304b7077cefbdb6c4d4390f69ac9d1a04d9023cc8daa1e6778b5dbd3" Sep 30 17:31:38 crc kubenswrapper[4778]: I0930 17:31:38.024902 4778 scope.go:117] "RemoveContainer" containerID="72f2f957e64e5a13361ad669f83ed6b96713d574c01475be5d0eef065608c357" Sep 30 17:31:38 crc kubenswrapper[4778]: I0930 17:31:38.059701 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-jn9dq" event={"ID":"6ecf488a-bace-4ef9-bec7-aa29f7dd85e8","Type":"ContainerStarted","Data":"a5a8ee49b958ecf9286d401eee3bb2200ecc6995ad229d56a7b8bc23f36c62e7"} Sep 30 17:31:38 crc kubenswrapper[4778]: I0930 17:31:38.063081 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-w69kc" event={"ID":"0bbfd0e9-af02-499b-98eb-c27f5eaed971","Type":"ContainerStarted","Data":"6cdf817104dc0319cc7ea24d2964d12cc8fd9ea99514f1d9a6eb09dbb001f700"} Sep 30 17:31:38 crc kubenswrapper[4778]: E0930 17:31:38.271053 4778 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-49dlx" podUID="08b975a6-8387-4c2b-ab76-c34f30ac2f02" Sep 30 17:31:38 crc kubenswrapper[4778]: E0930 17:31:38.356989 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l" podUID="aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5" Sep 30 17:31:38 crc kubenswrapper[4778]: E0930 17:31:38.504901 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8hfcx" podUID="9f70afd9-8a72-4776-8586-6afee1834e3f" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.099234 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-jn9dq" event={"ID":"6ecf488a-bace-4ef9-bec7-aa29f7dd85e8","Type":"ContainerStarted","Data":"cb4e9712600589c5d861beb148eaf8260f875763cbfd65e98e7c9cdc7492cfa0"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.100238 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-jn9dq" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.101675 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-n7ssv" event={"ID":"3fbf26ad-2f08-4506-8d77-c0162b8792f5","Type":"ContainerStarted","Data":"d90c542450568baa69c06feb2bcf4398f0eac05fe33a6105956127d69ee8a819"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.115558 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" event={"ID":"77616b28-3707-4b13-a2ca-efc265a63676","Type":"ContainerStarted","Data":"c4198735fbd543255c5dafb6cbce337ada1b1ebfc556ec750d9abaa2422efc89"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.116195 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.126800 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jhqnp" event={"ID":"66ff4558-e436-4870-a8b0-61124b0322f7","Type":"ContainerStarted","Data":"f6f18a1fa5cd576185bed4a9963937daa9baf0d2e2a68f31f881b7afe351dc2f"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.131319 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-84vkl" event={"ID":"e84606fe-316d-493e-8159-d9707c2f2a47","Type":"ContainerStarted","Data":"9e2cc66b3da050d6d3fe95f87faa9cddd7f3ade1127d5f67e7ecfe67b3e84b96"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.135406 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-jn9dq" podStartSLOduration=5.715240043 
podStartE2EDuration="26.135394078s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.280343806 +0000 UTC m=+814.270241619" lastFinishedPulling="2025-09-30 17:31:35.700497851 +0000 UTC m=+834.690395654" observedRunningTime="2025-09-30 17:31:39.133430494 +0000 UTC m=+838.123328307" watchObservedRunningTime="2025-09-30 17:31:39.135394078 +0000 UTC m=+838.125291881" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.140126 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-9s9jn" event={"ID":"4cdf1c9f-3579-41f3-9a10-05c1c8a5f241","Type":"ContainerStarted","Data":"fc8a97df5011e617cd5a8ccd2b1e4d954e449f7b395f3b7728a86fde99ae515c"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.140475 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-9s9jn" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.145297 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-8w47p" event={"ID":"d0633e5b-e292-47b8-81e6-b752204748a9","Type":"ContainerStarted","Data":"fa53c8b3537d5e36a077378d392b5c588c18b7d62f1687f154f0b564837ddbf7"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.166459 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-28qfs" event={"ID":"9235e0ee-34ca-4be2-af88-ef695afe5224","Type":"ContainerStarted","Data":"8fee0d9df719f95e400d226978c62f28e859d74c8cbd72130ef6dbbbff0a8545"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.166907 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-28qfs" event={"ID":"9235e0ee-34ca-4be2-af88-ef695afe5224","Type":"ContainerStarted","Data":"73fc8a3df8f0de75acb82bdccb5c9271dcbff4bf8d0dab864222053396900811"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.167857 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-28qfs" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.170004 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx" podStartSLOduration=4.321094308 podStartE2EDuration="26.169990151s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.712599023 +0000 UTC m=+814.702496826" lastFinishedPulling="2025-09-30 17:31:37.561494856 +0000 UTC m=+836.551392669" observedRunningTime="2025-09-30 17:31:39.168176712 +0000 UTC m=+838.158074515" watchObservedRunningTime="2025-09-30 17:31:39.169990151 +0000 UTC m=+838.159887954" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.185520 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-m4rzv" event={"ID":"3fd2b550-65c1-4163-b735-2e517b37c34c","Type":"ContainerStarted","Data":"14656dc387403a5a199243a38691e24b000530485c43d9c4a51c6dbd3c29c78c"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.186304 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-m4rzv" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.195186 4778 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l" event={"ID":"aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5","Type":"ContainerStarted","Data":"348fc3ddc6070ade5d0e85750cefe9696a816fe310600508b5453a383671f93a"} Sep 30 17:31:39 crc kubenswrapper[4778]: E0930 17:31:39.202208 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l" podUID="aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.206928 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-28qfs" podStartSLOduration=6.769033155 podStartE2EDuration="26.20689694s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.351685773 +0000 UTC m=+814.341583576" lastFinishedPulling="2025-09-30 17:31:34.789549558 +0000 UTC m=+833.779447361" observedRunningTime="2025-09-30 17:31:39.206221467 +0000 UTC m=+838.196119300" watchObservedRunningTime="2025-09-30 17:31:39.20689694 +0000 UTC m=+838.196794753" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.210906 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8hfcx" event={"ID":"9f70afd9-8a72-4776-8586-6afee1834e3f","Type":"ContainerStarted","Data":"d3e2dd36e96b42788af0bafd3a5878624a006e87cb125a1cec6d9b240acb0f6e"} Sep 30 17:31:39 crc kubenswrapper[4778]: E0930 17:31:39.212432 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.51:5001/openstack-k8s-operators/telemetry-operator:ac359d938872c47e1f3d7d8466b12f9d1f8a5236\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8hfcx" podUID="9f70afd9-8a72-4776-8586-6afee1834e3f" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.219336 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-j4dhd" event={"ID":"c32b9788-c6d5-43ae-a777-ce4aeb3cabf0","Type":"ContainerStarted","Data":"b60827291bed359453da1f621a885edeb8caa125cb2632751e6e897e0b0ffac2"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.230212 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-49dlx" event={"ID":"08b975a6-8387-4c2b-ab76-c34f30ac2f02","Type":"ContainerStarted","Data":"f8c3ae55494068d7c11cc9a055ad7f507f2b7d75659219f935741fd71984923c"} Sep 30 17:31:39 crc kubenswrapper[4778]: E0930 17:31:39.231665 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:1e2c65f4331a2bb568d97fbcd02e3bca2627e133a794e1e4fd13368e86ce6bd1\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-49dlx" podUID="08b975a6-8387-4c2b-ab76-c34f30ac2f02" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.232608 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-rhgbx" 
event={"ID":"62a10d57-e47f-45de-9588-a0abb103b727","Type":"ContainerStarted","Data":"3f4e8457319d2db94b1439bed111faef588d3687cf7c173eff2c3f11178c07cb"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.232914 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-rhgbx" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.259016 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vfvgx" event={"ID":"60314fa2-575c-42dc-beb7-d37d3ba69cac","Type":"ContainerStarted","Data":"a0f94922eefbc4799cc2a725de71cd858423a1d8724ed79c4b5b8fa1de99dd1d"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.259727 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vfvgx" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.284887 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-w69kc" event={"ID":"0bbfd0e9-af02-499b-98eb-c27f5eaed971","Type":"ContainerStarted","Data":"b271c0684dd9f9be59346a38cd340a898f830dcd9c2da8a1358e045221abbc56"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.285518 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-w69kc" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.286724 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-x9xzq" event={"ID":"7ae19188-a950-4136-b02f-264588920c60","Type":"ContainerStarted","Data":"2d59ade6d30cf77f9a0a24ec972e0e4146d79356ef0f4588f4dd47aa819dc700"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.287855 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-vtw6g" event={"ID":"b159e466-544e-47b1-9617-d1ffcec28b1c","Type":"ContainerStarted","Data":"a2847e15f51f6e2e92096cd8b2953175771f46b48d71578172e82f34c84bf69a"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.288969 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-vtw6g" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.297906 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-9s9jn" podStartSLOduration=4.399479185 podStartE2EDuration="26.297887725s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.663285042 +0000 UTC m=+814.653182845" lastFinishedPulling="2025-09-30 17:31:37.561693572 +0000 UTC m=+836.551591385" observedRunningTime="2025-09-30 17:31:39.244996107 +0000 UTC m=+838.234893910" watchObservedRunningTime="2025-09-30 17:31:39.297887725 +0000 UTC m=+838.287785528" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.304061 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-ldb8b" event={"ID":"849e7052-8c4b-469c-8dbe-3d6d3099ed7d","Type":"ContainerStarted","Data":"b6c2e9257fe53220efa109b28c49485eb25ce56a5869b174e05441e2036a04e5"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.309746 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-76669f99c-z6mzw" event={"ID":"84123306-eb7b-45d1-b542-f1f831949fb4","Type":"ContainerStarted","Data":"f6a5202b193d34b57b31108237061885dec14d0949607500facc798f7404fae0"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.310639 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-z6mzw" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.329102 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-rhgbx" podStartSLOduration=4.43690945 podStartE2EDuration="26.329086688s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.668022786 +0000 UTC m=+814.657920589" lastFinishedPulling="2025-09-30 17:31:37.560200014 +0000 UTC m=+836.550097827" observedRunningTime="2025-09-30 17:31:39.301367957 +0000 UTC m=+838.291265760" watchObservedRunningTime="2025-09-30 17:31:39.329086688 +0000 UTC m=+838.318984491" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.341065 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-ztxmv" event={"ID":"89a96b51-b353-47e4-8242-9de93451c210","Type":"ContainerStarted","Data":"289487c7c3e8610eccd3423bdbf236590d6902f614a48328b9610b76d25c47dc"} Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.360556 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-m4rzv" podStartSLOduration=4.453115335 podStartE2EDuration="26.360538149s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.653237215 +0000 UTC m=+814.643135018" lastFinishedPulling="2025-09-30 17:31:37.560659989 +0000 UTC m=+836.550557832" observedRunningTime="2025-09-30 17:31:39.359316579 +0000 UTC m=+838.349214382" watchObservedRunningTime="2025-09-30 17:31:39.360538149 +0000 UTC m=+838.350435952" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.430151 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-vtw6g" podStartSLOduration=4.56534758 podStartE2EDuration="26.430135049s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.696875713 +0000 UTC m=+814.686773516" lastFinishedPulling="2025-09-30 17:31:37.561663162 +0000 UTC m=+836.551560985" observedRunningTime="2025-09-30 17:31:39.428371292 +0000 UTC m=+838.418269095" watchObservedRunningTime="2025-09-30 17:31:39.430135049 +0000 UTC m=+838.420032852" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.450371 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vfvgx" podStartSLOduration=4.517404554 podStartE2EDuration="26.450354726s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.663780748 +0000 UTC m=+814.653678551" lastFinishedPulling="2025-09-30 17:31:37.59673091 +0000 UTC m=+836.586628723" observedRunningTime="2025-09-30 17:31:39.44802219 +0000 UTC m=+838.437919993" watchObservedRunningTime="2025-09-30 17:31:39.450354726 +0000 UTC m=+838.440252529" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.471586 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/watcher-operator-controller-manager-76669f99c-z6mzw" podStartSLOduration=4.581478245 podStartE2EDuration="26.471567215s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.691659793 +0000 UTC m=+814.681557596" lastFinishedPulling="2025-09-30 17:31:37.581748753 +0000 UTC m=+836.571646566" observedRunningTime="2025-09-30 17:31:39.471106389 +0000 UTC m=+838.461004192" watchObservedRunningTime="2025-09-30 17:31:39.471567215 +0000 UTC m=+838.461465018" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.498340 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-w69kc" podStartSLOduration=7.054278418 podStartE2EDuration="26.498322834s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.345936756 +0000 UTC m=+814.335834569" lastFinishedPulling="2025-09-30 17:31:34.789981182 +0000 UTC m=+833.779878985" observedRunningTime="2025-09-30 17:31:39.496564337 +0000 UTC m=+838.486462140" watchObservedRunningTime="2025-09-30 17:31:39.498322834 +0000 UTC m=+838.488220637" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.827155 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bcvm2"] Sep 30 17:31:39 crc kubenswrapper[4778]: E0930 17:31:39.828075 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63657eef-94dd-4312-9807-6214904375ed" containerName="extract-utilities" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.828162 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="63657eef-94dd-4312-9807-6214904375ed" containerName="extract-utilities" Sep 30 17:31:39 crc kubenswrapper[4778]: E0930 17:31:39.828258 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b" containerName="extract-utilities" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.828349 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b" containerName="extract-utilities" Sep 30 17:31:39 crc kubenswrapper[4778]: E0930 17:31:39.829190 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b" containerName="extract-content" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.832734 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b" containerName="extract-content" Sep 30 17:31:39 crc kubenswrapper[4778]: E0930 17:31:39.833003 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63657eef-94dd-4312-9807-6214904375ed" containerName="extract-content" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.833057 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="63657eef-94dd-4312-9807-6214904375ed" containerName="extract-content" Sep 30 17:31:39 crc kubenswrapper[4778]: E0930 17:31:39.833119 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b" containerName="registry-server" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.833174 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b" containerName="registry-server" Sep 30 17:31:39 crc kubenswrapper[4778]: E0930 17:31:39.833249 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63657eef-94dd-4312-9807-6214904375ed" containerName="registry-server" Sep 30 
17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.833305 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="63657eef-94dd-4312-9807-6214904375ed" containerName="registry-server" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.833562 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="34b5750f-fb14-4b9f-b27d-8d9a4d4e5a4b" containerName="registry-server" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.833644 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="63657eef-94dd-4312-9807-6214904375ed" containerName="registry-server" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.835038 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bcvm2" Sep 30 17:31:39 crc kubenswrapper[4778]: I0930 17:31:39.837969 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bcvm2"] Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.004929 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89f474d7-03a5-422d-8d39-24d4951e2ad3-catalog-content\") pod \"community-operators-bcvm2\" (UID: \"89f474d7-03a5-422d-8d39-24d4951e2ad3\") " pod="openshift-marketplace/community-operators-bcvm2" Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.005826 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd5qw\" (UniqueName: \"kubernetes.io/projected/89f474d7-03a5-422d-8d39-24d4951e2ad3-kube-api-access-qd5qw\") pod \"community-operators-bcvm2\" (UID: \"89f474d7-03a5-422d-8d39-24d4951e2ad3\") " pod="openshift-marketplace/community-operators-bcvm2" Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.005975 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89f474d7-03a5-422d-8d39-24d4951e2ad3-utilities\") pod \"community-operators-bcvm2\" (UID: \"89f474d7-03a5-422d-8d39-24d4951e2ad3\") " pod="openshift-marketplace/community-operators-bcvm2" Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.107425 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89f474d7-03a5-422d-8d39-24d4951e2ad3-utilities\") pod \"community-operators-bcvm2\" (UID: \"89f474d7-03a5-422d-8d39-24d4951e2ad3\") " pod="openshift-marketplace/community-operators-bcvm2" Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.108219 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89f474d7-03a5-422d-8d39-24d4951e2ad3-catalog-content\") pod \"community-operators-bcvm2\" (UID: \"89f474d7-03a5-422d-8d39-24d4951e2ad3\") " pod="openshift-marketplace/community-operators-bcvm2" Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.108344 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd5qw\" (UniqueName: \"kubernetes.io/projected/89f474d7-03a5-422d-8d39-24d4951e2ad3-kube-api-access-qd5qw\") pod \"community-operators-bcvm2\" (UID: \"89f474d7-03a5-422d-8d39-24d4951e2ad3\") " pod="openshift-marketplace/community-operators-bcvm2" Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.109142 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/89f474d7-03a5-422d-8d39-24d4951e2ad3-utilities\") pod \"community-operators-bcvm2\" (UID: \"89f474d7-03a5-422d-8d39-24d4951e2ad3\") " pod="openshift-marketplace/community-operators-bcvm2" Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.109417 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89f474d7-03a5-422d-8d39-24d4951e2ad3-catalog-content\") pod \"community-operators-bcvm2\" (UID: \"89f474d7-03a5-422d-8d39-24d4951e2ad3\") " pod="openshift-marketplace/community-operators-bcvm2" Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.142108 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd5qw\" (UniqueName: \"kubernetes.io/projected/89f474d7-03a5-422d-8d39-24d4951e2ad3-kube-api-access-qd5qw\") pod \"community-operators-bcvm2\" (UID: \"89f474d7-03a5-422d-8d39-24d4951e2ad3\") " pod="openshift-marketplace/community-operators-bcvm2" Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.154329 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bcvm2" Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.369490 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-84vkl" event={"ID":"e84606fe-316d-493e-8159-d9707c2f2a47","Type":"ContainerStarted","Data":"bca5811aaad48acb70966873c5b4b19c2734269cb52c136b9831cdb4fc66a986"} Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.371052 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-84vkl" Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.390606 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-84vkl" podStartSLOduration=7.478395209 podStartE2EDuration="27.390585319s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:14.874796576 +0000 UTC m=+813.864694389" lastFinishedPulling="2025-09-30 17:31:34.786986696 +0000 UTC m=+833.776884499" observedRunningTime="2025-09-30 17:31:40.388836512 +0000 UTC m=+839.378734305" watchObservedRunningTime="2025-09-30 17:31:40.390585319 +0000 UTC m=+839.380483122" Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.399930 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-n7ssv" event={"ID":"3fbf26ad-2f08-4506-8d77-c0162b8792f5","Type":"ContainerStarted","Data":"0ba4672ba104872bab5a990f73c8591384b64eff290c1c1f62fd03e114bc9fab"} Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.400943 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-n7ssv" Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.407247 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-8w47p" event={"ID":"d0633e5b-e292-47b8-81e6-b752204748a9","Type":"ContainerStarted","Data":"fb0535d609a01b9a2d34d1ab7c63acd9962bac0d7dccee596653ab57bb69fafc"} Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.407687 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-8w47p" Sep 30 
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.424788 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-x9xzq" event={"ID":"7ae19188-a950-4136-b02f-264588920c60","Type":"ContainerStarted","Data":"c5958d9f9244edc0144d4e04dfb3cc151df8362089752b33e86fda0943d8a9d2"}
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.425269 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-x9xzq"
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.426792 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jhqnp" event={"ID":"66ff4558-e436-4870-a8b0-61124b0322f7","Type":"ContainerStarted","Data":"335ba4fc97c4511ad79349a58bf67c9e03a5c2f5314ba7f042fe47db43a4d4fb"}
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.427165 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jhqnp"
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.427189 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-n7ssv" podStartSLOduration=8.24578213 podStartE2EDuration="27.427168727s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.608700519 +0000 UTC m=+814.598598322" lastFinishedPulling="2025-09-30 17:31:34.790087116 +0000 UTC m=+833.779984919" observedRunningTime="2025-09-30 17:31:40.423098055 +0000 UTC m=+839.412995858" watchObservedRunningTime="2025-09-30 17:31:40.427168727 +0000 UTC m=+839.417066530"
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.428786 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-ztxmv" event={"ID":"89a96b51-b353-47e4-8242-9de93451c210","Type":"ContainerStarted","Data":"eb24624bea91add941d53520abb003c70d0434cfaced1d8f381642083738b902"}
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.436670 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-ztxmv"
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.459989 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-ldb8b" event={"ID":"849e7052-8c4b-469c-8dbe-3d6d3099ed7d","Type":"ContainerStarted","Data":"1c96c6601097a1da7d87129b9bdfdae067c094b7ec2b1f8032530daa57c70ad7"}
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.460255 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-ldb8b"
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.467268 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-j4dhd" event={"ID":"c32b9788-c6d5-43ae-a777-ce4aeb3cabf0","Type":"ContainerStarted","Data":"9f620adaeef6bb0b853e28dd1462ded423c85543f7b9645fa18e2e60034ef3d9"}
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.467304 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-f66b554c6-j4dhd"
Sep 30 17:31:40 crc kubenswrapper[4778]: E0930 17:31:40.468679 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.51:5001/openstack-k8s-operators/telemetry-operator:ac359d938872c47e1f3d7d8466b12f9d1f8a5236\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8hfcx" podUID="9f70afd9-8a72-4776-8586-6afee1834e3f"
Sep 30 17:31:40 crc kubenswrapper[4778]: E0930 17:31:40.469212 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:de99ad053f95f132f62b38335b2e8bf22fc28acbd441c3814764d63b63ef755f\\\"\"" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l" podUID="aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5"
Sep 30 17:31:40 crc kubenswrapper[4778]: E0930 17:31:40.471293 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:1e2c65f4331a2bb568d97fbcd02e3bca2627e133a794e1e4fd13368e86ce6bd1\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-49dlx" podUID="08b975a6-8387-4c2b-ab76-c34f30ac2f02"
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.481805 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-ztxmv" podStartSLOduration=8.282499664 podStartE2EDuration="27.481791791s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.587712788 +0000 UTC m=+814.577610591" lastFinishedPulling="2025-09-30 17:31:34.787004915 +0000 UTC m=+833.776902718" observedRunningTime="2025-09-30 17:31:40.479229578 +0000 UTC m=+839.469127381" watchObservedRunningTime="2025-09-30 17:31:40.481791791 +0000 UTC m=+839.471689594"
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.482271 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-8w47p" podStartSLOduration=7.028018874 podStartE2EDuration="27.482264657s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:14.335789122 +0000 UTC m=+813.325686925" lastFinishedPulling="2025-09-30 17:31:34.790034905 +0000 UTC m=+833.779932708" observedRunningTime="2025-09-30 17:31:40.45896953 +0000 UTC m=+839.448867333" watchObservedRunningTime="2025-09-30 17:31:40.482264657 +0000 UTC m=+839.472162460"
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.496181 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jhqnp" podStartSLOduration=7.626885273 podStartE2EDuration="27.496162438s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:14.918140994 +0000 UTC m=+813.908038807" lastFinishedPulling="2025-09-30 17:31:34.787418169 +0000 UTC m=+833.777315972" observedRunningTime="2025-09-30 17:31:40.489498832 +0000 UTC m=+839.479396635" watchObservedRunningTime="2025-09-30 17:31:40.496162438 +0000 UTC m=+839.486060241"
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.536875 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-x9xzq" podStartSLOduration=7.181418866 podStartE2EDuration="27.536858759s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.344603383 +0000 UTC m=+814.334501196" lastFinishedPulling="2025-09-30 17:31:35.700043286 +0000 UTC m=+834.689941089" observedRunningTime="2025-09-30 17:31:40.514557685 +0000 UTC m=+839.504455488" watchObservedRunningTime="2025-09-30 17:31:40.536858759 +0000 UTC m=+839.526756562"
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.582327 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-f66b554c6-j4dhd" podStartSLOduration=7.483181025 podStartE2EDuration="27.582305545s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.599852492 +0000 UTC m=+814.589750295" lastFinishedPulling="2025-09-30 17:31:35.698977012 +0000 UTC m=+834.688874815" observedRunningTime="2025-09-30 17:31:40.577571171 +0000 UTC m=+839.567468974" watchObservedRunningTime="2025-09-30 17:31:40.582305545 +0000 UTC m=+839.572203348"
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.596112 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-ldb8b" podStartSLOduration=7.228474082 podStartE2EDuration="27.596085782s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.365603634 +0000 UTC m=+814.355501437" lastFinishedPulling="2025-09-30 17:31:35.733215294 +0000 UTC m=+834.723113137" observedRunningTime="2025-09-30 17:31:40.590282934 +0000 UTC m=+839.580180737" watchObservedRunningTime="2025-09-30 17:31:40.596085782 +0000 UTC m=+839.585983585"
Sep 30 17:31:40 crc kubenswrapper[4778]: I0930 17:31:40.695239 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bcvm2"]
Sep 30 17:31:41 crc kubenswrapper[4778]: I0930 17:31:41.479722 4778 generic.go:334] "Generic (PLEG): container finished" podID="89f474d7-03a5-422d-8d39-24d4951e2ad3" containerID="62be2f4b5d6f14defe83ad2cf4237a488621d13a6894e24c182f90b983005035" exitCode=0
Sep 30 17:31:41 crc kubenswrapper[4778]: I0930 17:31:41.479940 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcvm2" event={"ID":"89f474d7-03a5-422d-8d39-24d4951e2ad3","Type":"ContainerDied","Data":"62be2f4b5d6f14defe83ad2cf4237a488621d13a6894e24c182f90b983005035"}
Sep 30 17:31:41 crc kubenswrapper[4778]: I0930 17:31:41.480004 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcvm2" event={"ID":"89f474d7-03a5-422d-8d39-24d4951e2ad3","Type":"ContainerStarted","Data":"33e47f2184a0f1dda93037941ac45c308bbb8f598ce1bd2668cc2c15b059187b"}
Sep 30 17:31:42 crc kubenswrapper[4778]: I0930 17:31:42.496913 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcvm2" event={"ID":"89f474d7-03a5-422d-8d39-24d4951e2ad3","Type":"ContainerStarted","Data":"3b3b5644a4d648b2a4568d0b738c45856610ef8c478d0ccc0dd094a024515cbb"}
Sep 30 17:31:43 crc kubenswrapper[4778]: I0930 17:31:43.350011 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-8w47p"
Sep 30 17:31:43 crc kubenswrapper[4778]: I0930 17:31:43.414073 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-84vkl"
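[Editor's note] The "Observed pod startup duration" records above carry enough data to check themselves: podStartE2EDuration runs from podCreationTimestamp to the observed running time, and podStartSLOduration appears to be that span minus the image-pull window (lastFinishedPulling - firstStartedPulling). A minimal sketch recomputing the neutron-operator figures under that assumption (it reproduces the logged values exactly; not kubelet source):

```python
# Seconds within minute 17:31, copied from the neutron-operator record above.
creation   = 13.0            # podCreationTimestamp  17:31:13
pull_start = 15.608700519    # firstStartedPulling
pull_end   = 34.790087116    # lastFinishedPulling
observed   = 40.427168727    # watchObservedRunningTime

e2e = observed - creation                 # 27.427168727 -> podStartE2EDuration
slo = e2e - (pull_end - pull_start)       # 8.245782130  -> podStartSLOduration
print(f"{e2e=:.9f} {slo=:.9f}")
```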
Sep 30 17:31:43 crc kubenswrapper[4778]: I0930 17:31:43.464335 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-jhqnp"
Sep 30 17:31:43 crc kubenswrapper[4778]: I0930 17:31:43.507474 4778 generic.go:334] "Generic (PLEG): container finished" podID="89f474d7-03a5-422d-8d39-24d4951e2ad3" containerID="3b3b5644a4d648b2a4568d0b738c45856610ef8c478d0ccc0dd094a024515cbb" exitCode=0
Sep 30 17:31:43 crc kubenswrapper[4778]: I0930 17:31:43.508082 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcvm2" event={"ID":"89f474d7-03a5-422d-8d39-24d4951e2ad3","Type":"ContainerDied","Data":"3b3b5644a4d648b2a4568d0b738c45856610ef8c478d0ccc0dd094a024515cbb"}
Sep 30 17:31:43 crc kubenswrapper[4778]: I0930 17:31:43.527483 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-jn9dq"
Sep 30 17:31:43 crc kubenswrapper[4778]: I0930 17:31:43.532416 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-w69kc"
Sep 30 17:31:43 crc kubenswrapper[4778]: I0930 17:31:43.606438 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-ldb8b"
Sep 30 17:31:43 crc kubenswrapper[4778]: I0930 17:31:43.646228 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-n7ssv"
Sep 30 17:31:43 crc kubenswrapper[4778]: I0930 17:31:43.681306 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-28qfs"
Sep 30 17:31:43 crc kubenswrapper[4778]: I0930 17:31:43.725360 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-rhgbx"
Sep 30 17:31:43 crc kubenswrapper[4778]: I0930 17:31:43.821386 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-x9xzq"
Sep 30 17:31:43 crc kubenswrapper[4778]: I0930 17:31:43.823831 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-9s9jn"
Sep 30 17:31:43 crc kubenswrapper[4778]: I0930 17:31:43.841895 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-ztxmv"
Sep 30 17:31:43 crc kubenswrapper[4778]: I0930 17:31:43.896007 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-m4rzv"
Sep 30 17:31:43 crc kubenswrapper[4778]: I0930 17:31:43.913753 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vfvgx"
Sep 30 17:31:43 crc kubenswrapper[4778]: I0930 17:31:43.953390 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-vtw6g"
Sep 30 17:31:44 crc kubenswrapper[4778]: I0930 17:31:44.013532 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-f66b554c6-j4dhd"
Sep 30 17:31:44 crc kubenswrapper[4778]: I0930 17:31:44.162586 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-76669f99c-z6mzw"
Sep 30 17:31:44 crc kubenswrapper[4778]: I0930 17:31:44.492604 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx"
Sep 30 17:31:44 crc kubenswrapper[4778]: I0930 17:31:44.519760 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcvm2" event={"ID":"89f474d7-03a5-422d-8d39-24d4951e2ad3","Type":"ContainerStarted","Data":"094c2cdce009381e1253cf8c536166ae7ef883951779355b55b20922df4076dd"}
Sep 30 17:31:44 crc kubenswrapper[4778]: I0930 17:31:44.606274 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bcvm2" podStartSLOduration=3.165799989 podStartE2EDuration="5.606254521s" podCreationTimestamp="2025-09-30 17:31:39 +0000 UTC" firstStartedPulling="2025-09-30 17:31:41.482720926 +0000 UTC m=+840.472618769" lastFinishedPulling="2025-09-30 17:31:43.923175498 +0000 UTC m=+842.913073301" observedRunningTime="2025-09-30 17:31:44.60252244 +0000 UTC m=+843.592420253" watchObservedRunningTime="2025-09-30 17:31:44.606254521 +0000 UTC m=+843.596152344"
Sep 30 17:31:44 crc kubenswrapper[4778]: I0930 17:31:44.812524 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 17:31:44 crc kubenswrapper[4778]: I0930 17:31:44.812600 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 17:31:44 crc kubenswrapper[4778]: I0930 17:31:44.812675 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb"
Sep 30 17:31:44 crc kubenswrapper[4778]: I0930 17:31:44.813152 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7e8e94b4bfe4e71036adf0980559cf4826c826e3aed3c9f8f0d61aee964cec6e"} pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 17:31:44 crc kubenswrapper[4778]: I0930 17:31:44.813223 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" containerID="cri-o://7e8e94b4bfe4e71036adf0980559cf4826c826e3aed3c9f8f0d61aee964cec6e" gracePeriod=600
Sep 30 17:31:45 crc kubenswrapper[4778]: I0930 17:31:45.530326 4778 generic.go:334] "Generic (PLEG): container finished" podID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerID="7e8e94b4bfe4e71036adf0980559cf4826c826e3aed3c9f8f0d61aee964cec6e" exitCode=0
Sep 30 17:31:45 crc kubenswrapper[4778]: I0930 17:31:45.530435 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerDied","Data":"7e8e94b4bfe4e71036adf0980559cf4826c826e3aed3c9f8f0d61aee964cec6e"}
Sep 30 17:31:45 crc kubenswrapper[4778]: I0930 17:31:45.530702 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerStarted","Data":"97044f3dbaff452261d88827459c4da3b10678dd945a7792a4a92fff2dc6be50"}
Sep 30 17:31:45 crc kubenswrapper[4778]: I0930 17:31:45.530757 4778 scope.go:117] "RemoveContainer" containerID="87ec9413bfb27167167aeb00914e82262471cb92624b3bf11492e1b4663098b9"
Sep 30 17:31:50 crc kubenswrapper[4778]: I0930 17:31:50.155017 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bcvm2"
Sep 30 17:31:50 crc kubenswrapper[4778]: I0930 17:31:50.155729 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bcvm2"
Sep 30 17:31:50 crc kubenswrapper[4778]: I0930 17:31:50.245758 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bcvm2"
Sep 30 17:31:50 crc kubenswrapper[4778]: I0930 17:31:50.656921 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bcvm2"
Sep 30 17:31:50 crc kubenswrapper[4778]: I0930 17:31:50.722286 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bcvm2"]
Sep 30 17:31:52 crc kubenswrapper[4778]: I0930 17:31:52.601679 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-hrfgk" event={"ID":"dd54c570-121e-4daa-b264-fbb40a606478","Type":"ContainerStarted","Data":"a79b50c13f529e3ee11ce39cc1b4dd4ecb27ceb556a603ee0fa7c0d8acb3a387"}
Sep 30 17:31:52 crc kubenswrapper[4778]: I0930 17:31:52.606195 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-49dlx" event={"ID":"08b975a6-8387-4c2b-ab76-c34f30ac2f02","Type":"ContainerStarted","Data":"6adf7587f31ff9505389f609d519778f9e966cf7d3a4275a43adffc64e84289a"}
Sep 30 17:31:52 crc kubenswrapper[4778]: I0930 17:31:52.606518 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bcvm2" podUID="89f474d7-03a5-422d-8d39-24d4951e2ad3" containerName="registry-server" containerID="cri-o://094c2cdce009381e1253cf8c536166ae7ef883951779355b55b20922df4076dd" gracePeriod=2
Sep 30 17:31:52 crc kubenswrapper[4778]: I0930 17:31:52.606819 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-49dlx"
Sep 30 17:31:52 crc kubenswrapper[4778]: I0930 17:31:52.684082 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-49dlx" podStartSLOduration=1.778406875 podStartE2EDuration="39.684042263s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:14.327166852 +0000 UTC m=+813.317064655" lastFinishedPulling="2025-09-30 17:31:52.23280223 +0000 UTC m=+851.222700043" observedRunningTime="2025-09-30 17:31:52.667077062 +0000 UTC m=+851.656974875" watchObservedRunningTime="2025-09-30 17:31:52.684042263 +0000 UTC m=+851.673940076"
Sep 30 17:31:52 crc kubenswrapper[4778]: I0930 17:31:52.684932 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-hrfgk" podStartSLOduration=3.991134961 podStartE2EDuration="39.684836358s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.607471589 +0000 UTC m=+814.597369392" lastFinishedPulling="2025-09-30 17:31:51.301172946 +0000 UTC m=+850.291070789" observedRunningTime="2025-09-30 17:31:52.63314007 +0000 UTC m=+851.623037923" watchObservedRunningTime="2025-09-30 17:31:52.684836358 +0000 UTC m=+851.674734181"
Sep 30 17:31:52 crc kubenswrapper[4778]: E0930 17:31:52.836334 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89f474d7_03a5_422d_8d39_24d4951e2ad3.slice/crio-094c2cdce009381e1253cf8c536166ae7ef883951779355b55b20922df4076dd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89f474d7_03a5_422d_8d39_24d4951e2ad3.slice/crio-conmon-094c2cdce009381e1253cf8c536166ae7ef883951779355b55b20922df4076dd.scope\": RecentStats: unable to find data in memory cache]"
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.241149 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bcvm2"
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.346687 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89f474d7-03a5-422d-8d39-24d4951e2ad3-catalog-content\") pod \"89f474d7-03a5-422d-8d39-24d4951e2ad3\" (UID: \"89f474d7-03a5-422d-8d39-24d4951e2ad3\") "
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.346763 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89f474d7-03a5-422d-8d39-24d4951e2ad3-utilities\") pod \"89f474d7-03a5-422d-8d39-24d4951e2ad3\" (UID: \"89f474d7-03a5-422d-8d39-24d4951e2ad3\") "
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.346862 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd5qw\" (UniqueName: \"kubernetes.io/projected/89f474d7-03a5-422d-8d39-24d4951e2ad3-kube-api-access-qd5qw\") pod \"89f474d7-03a5-422d-8d39-24d4951e2ad3\" (UID: \"89f474d7-03a5-422d-8d39-24d4951e2ad3\") "
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.347996 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89f474d7-03a5-422d-8d39-24d4951e2ad3-utilities" (OuterVolumeSpecName: "utilities") pod "89f474d7-03a5-422d-8d39-24d4951e2ad3" (UID: "89f474d7-03a5-422d-8d39-24d4951e2ad3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.356046 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89f474d7-03a5-422d-8d39-24d4951e2ad3-kube-api-access-qd5qw" (OuterVolumeSpecName: "kube-api-access-qd5qw") pod "89f474d7-03a5-422d-8d39-24d4951e2ad3" (UID: "89f474d7-03a5-422d-8d39-24d4951e2ad3"). InnerVolumeSpecName "kube-api-access-qd5qw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.443197 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89f474d7-03a5-422d-8d39-24d4951e2ad3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89f474d7-03a5-422d-8d39-24d4951e2ad3" (UID: "89f474d7-03a5-422d-8d39-24d4951e2ad3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.448746 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd5qw\" (UniqueName: \"kubernetes.io/projected/89f474d7-03a5-422d-8d39-24d4951e2ad3-kube-api-access-qd5qw\") on node \"crc\" DevicePath \"\""
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.448779 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89f474d7-03a5-422d-8d39-24d4951e2ad3-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.448793 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89f474d7-03a5-422d-8d39-24d4951e2ad3-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.619417 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bcvm2"
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.619475 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcvm2" event={"ID":"89f474d7-03a5-422d-8d39-24d4951e2ad3","Type":"ContainerDied","Data":"094c2cdce009381e1253cf8c536166ae7ef883951779355b55b20922df4076dd"}
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.619577 4778 scope.go:117] "RemoveContainer" containerID="094c2cdce009381e1253cf8c536166ae7ef883951779355b55b20922df4076dd"
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.619269 4778 generic.go:334] "Generic (PLEG): container finished" podID="89f474d7-03a5-422d-8d39-24d4951e2ad3" containerID="094c2cdce009381e1253cf8c536166ae7ef883951779355b55b20922df4076dd" exitCode=0
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.620889 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bcvm2" event={"ID":"89f474d7-03a5-422d-8d39-24d4951e2ad3","Type":"ContainerDied","Data":"33e47f2184a0f1dda93037941ac45c308bbb8f598ce1bd2668cc2c15b059187b"}
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.624370 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l" event={"ID":"aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5","Type":"ContainerStarted","Data":"14fbd018282ca177e357861b00046406846ce59b180a3e38e1553a9b0e771a7a"}
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.645174 4778 scope.go:117] "RemoveContainer" containerID="3b3b5644a4d648b2a4568d0b738c45856610ef8c478d0ccc0dd094a024515cbb"
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.649827 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l" podStartSLOduration=2.85226309 podStartE2EDuration="40.649806236s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.617142784 +0000 UTC m=+814.607040587" lastFinishedPulling="2025-09-30 17:31:53.41468592 +0000 UTC m=+852.404583733" observedRunningTime="2025-09-30 17:31:53.647199192 +0000 UTC m=+852.637097005" watchObservedRunningTime="2025-09-30 17:31:53.649806236 +0000 UTC m=+852.639704039"
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.688188 4778 scope.go:117] "RemoveContainer" containerID="62be2f4b5d6f14defe83ad2cf4237a488621d13a6894e24c182f90b983005035"
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.689771 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bcvm2"]
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.698916 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bcvm2"]
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.712104 4778 scope.go:117] "RemoveContainer" containerID="094c2cdce009381e1253cf8c536166ae7ef883951779355b55b20922df4076dd"
Sep 30 17:31:53 crc kubenswrapper[4778]: E0930 17:31:53.712599 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"094c2cdce009381e1253cf8c536166ae7ef883951779355b55b20922df4076dd\": container with ID starting with 094c2cdce009381e1253cf8c536166ae7ef883951779355b55b20922df4076dd not found: ID does not exist" containerID="094c2cdce009381e1253cf8c536166ae7ef883951779355b55b20922df4076dd"
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.712670 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"094c2cdce009381e1253cf8c536166ae7ef883951779355b55b20922df4076dd"} err="failed to get container status \"094c2cdce009381e1253cf8c536166ae7ef883951779355b55b20922df4076dd\": rpc error: code = NotFound desc = could not find container \"094c2cdce009381e1253cf8c536166ae7ef883951779355b55b20922df4076dd\": container with ID starting with 094c2cdce009381e1253cf8c536166ae7ef883951779355b55b20922df4076dd not found: ID does not exist"
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.712701 4778 scope.go:117] "RemoveContainer" containerID="3b3b5644a4d648b2a4568d0b738c45856610ef8c478d0ccc0dd094a024515cbb"
Sep 30 17:31:53 crc kubenswrapper[4778]: E0930 17:31:53.713944 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b3b5644a4d648b2a4568d0b738c45856610ef8c478d0ccc0dd094a024515cbb\": container with ID starting with 3b3b5644a4d648b2a4568d0b738c45856610ef8c478d0ccc0dd094a024515cbb not found: ID does not exist" containerID="3b3b5644a4d648b2a4568d0b738c45856610ef8c478d0ccc0dd094a024515cbb"
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.713968 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b3b5644a4d648b2a4568d0b738c45856610ef8c478d0ccc0dd094a024515cbb"} err="failed to get container status \"3b3b5644a4d648b2a4568d0b738c45856610ef8c478d0ccc0dd094a024515cbb\": rpc error: code = NotFound desc = could not find container \"3b3b5644a4d648b2a4568d0b738c45856610ef8c478d0ccc0dd094a024515cbb\": container with ID starting with 3b3b5644a4d648b2a4568d0b738c45856610ef8c478d0ccc0dd094a024515cbb not found: ID does not exist"
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.713981 4778 scope.go:117] "RemoveContainer" containerID="62be2f4b5d6f14defe83ad2cf4237a488621d13a6894e24c182f90b983005035"
Sep 30 17:31:53 crc kubenswrapper[4778]: E0930 17:31:53.714253 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62be2f4b5d6f14defe83ad2cf4237a488621d13a6894e24c182f90b983005035\": container with ID starting with 62be2f4b5d6f14defe83ad2cf4237a488621d13a6894e24c182f90b983005035 not found: ID does not exist" containerID="62be2f4b5d6f14defe83ad2cf4237a488621d13a6894e24c182f90b983005035"
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.714307 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62be2f4b5d6f14defe83ad2cf4237a488621d13a6894e24c182f90b983005035"} err="failed to get container status \"62be2f4b5d6f14defe83ad2cf4237a488621d13a6894e24c182f90b983005035\": rpc error: code = NotFound desc = could not find container \"62be2f4b5d6f14defe83ad2cf4237a488621d13a6894e24c182f90b983005035\": container with ID starting with 62be2f4b5d6f14defe83ad2cf4237a488621d13a6894e24c182f90b983005035 not found: ID does not exist"
Sep 30 17:31:53 crc kubenswrapper[4778]: I0930 17:31:53.727102 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89f474d7-03a5-422d-8d39-24d4951e2ad3" path="/var/lib/kubelet/pods/89f474d7-03a5-422d-8d39-24d4951e2ad3/volumes"
Sep 30 17:31:54 crc kubenswrapper[4778]: I0930 17:31:54.106523 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l"
Sep 30 17:31:59 crc kubenswrapper[4778]: I0930 17:31:59.544014 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rsnpq"]
Sep 30 17:31:59 crc kubenswrapper[4778]: E0930 17:31:59.544886 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f474d7-03a5-422d-8d39-24d4951e2ad3" containerName="registry-server"
Sep 30 17:31:59 crc kubenswrapper[4778]: I0930 17:31:59.544898 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f474d7-03a5-422d-8d39-24d4951e2ad3" containerName="registry-server"
Sep 30 17:31:59 crc kubenswrapper[4778]: E0930 17:31:59.544911 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f474d7-03a5-422d-8d39-24d4951e2ad3" containerName="extract-utilities"
Sep 30 17:31:59 crc kubenswrapper[4778]: I0930 17:31:59.544918 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f474d7-03a5-422d-8d39-24d4951e2ad3" containerName="extract-utilities"
Sep 30 17:31:59 crc kubenswrapper[4778]: E0930 17:31:59.544930 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f474d7-03a5-422d-8d39-24d4951e2ad3" containerName="extract-content"
Sep 30 17:31:59 crc kubenswrapper[4778]: I0930 17:31:59.544937 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f474d7-03a5-422d-8d39-24d4951e2ad3" containerName="extract-content"
Sep 30 17:31:59 crc kubenswrapper[4778]: I0930 17:31:59.545079 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="89f474d7-03a5-422d-8d39-24d4951e2ad3" containerName="registry-server"
Sep 30 17:31:59 crc kubenswrapper[4778]: I0930 17:31:59.545994 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rsnpq"
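[Editor's note] The paired "RemoveContainer" / "ContainerStatus from runtime service failed ... NotFound" records above are a benign race: the container has already been deleted, so the follow-up status lookup for the same ID finds nothing and the "DeleteContainer returned error" line is harmless. When scanning journals like this one, records of that shape can usually be filtered out; a small illustrative helper (not part of any kubelet or CRI-O tooling):

```python
def is_benign_notfound(line: str) -> bool:
    # Matches the post-delete status lookups above, which fail with NotFound
    # for a container that was just removed on purpose.
    return ("ContainerStatus from runtime service failed" in line
            and "code = NotFound" in line)

# Usage: interesting = [l for l in journal_lines if not is_benign_notfound(l)]
```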
Sep 30 17:31:59 crc kubenswrapper[4778]: I0930 17:31:59.564054 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rsnpq"]
Sep 30 17:31:59 crc kubenswrapper[4778]: I0930 17:31:59.648257 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt75h\" (UniqueName: \"kubernetes.io/projected/26b19d5f-5d03-4c6c-94e8-b37997642089-kube-api-access-bt75h\") pod \"certified-operators-rsnpq\" (UID: \"26b19d5f-5d03-4c6c-94e8-b37997642089\") " pod="openshift-marketplace/certified-operators-rsnpq"
Sep 30 17:31:59 crc kubenswrapper[4778]: I0930 17:31:59.648324 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b19d5f-5d03-4c6c-94e8-b37997642089-utilities\") pod \"certified-operators-rsnpq\" (UID: \"26b19d5f-5d03-4c6c-94e8-b37997642089\") " pod="openshift-marketplace/certified-operators-rsnpq"
Sep 30 17:31:59 crc kubenswrapper[4778]: I0930 17:31:59.648346 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b19d5f-5d03-4c6c-94e8-b37997642089-catalog-content\") pod \"certified-operators-rsnpq\" (UID: \"26b19d5f-5d03-4c6c-94e8-b37997642089\") " pod="openshift-marketplace/certified-operators-rsnpq"
Sep 30 17:31:59 crc kubenswrapper[4778]: I0930 17:31:59.749946 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt75h\" (UniqueName: \"kubernetes.io/projected/26b19d5f-5d03-4c6c-94e8-b37997642089-kube-api-access-bt75h\") pod \"certified-operators-rsnpq\" (UID: \"26b19d5f-5d03-4c6c-94e8-b37997642089\") " pod="openshift-marketplace/certified-operators-rsnpq"
Sep 30 17:31:59 crc kubenswrapper[4778]: I0930 17:31:59.749997 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b19d5f-5d03-4c6c-94e8-b37997642089-utilities\") pod \"certified-operators-rsnpq\" (UID: \"26b19d5f-5d03-4c6c-94e8-b37997642089\") " pod="openshift-marketplace/certified-operators-rsnpq"
Sep 30 17:31:59 crc kubenswrapper[4778]: I0930 17:31:59.750015 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b19d5f-5d03-4c6c-94e8-b37997642089-catalog-content\") pod \"certified-operators-rsnpq\" (UID: \"26b19d5f-5d03-4c6c-94e8-b37997642089\") " pod="openshift-marketplace/certified-operators-rsnpq"
Sep 30 17:31:59 crc kubenswrapper[4778]: I0930 17:31:59.750589 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b19d5f-5d03-4c6c-94e8-b37997642089-catalog-content\") pod \"certified-operators-rsnpq\" (UID: \"26b19d5f-5d03-4c6c-94e8-b37997642089\") " pod="openshift-marketplace/certified-operators-rsnpq"
Sep 30 17:31:59 crc kubenswrapper[4778]: I0930 17:31:59.750710 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b19d5f-5d03-4c6c-94e8-b37997642089-utilities\") pod \"certified-operators-rsnpq\" (UID: \"26b19d5f-5d03-4c6c-94e8-b37997642089\") " pod="openshift-marketplace/certified-operators-rsnpq"
Sep 30 17:31:59 crc kubenswrapper[4778]: I0930 17:31:59.770500 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt75h\" (UniqueName: \"kubernetes.io/projected/26b19d5f-5d03-4c6c-94e8-b37997642089-kube-api-access-bt75h\") pod \"certified-operators-rsnpq\" (UID: \"26b19d5f-5d03-4c6c-94e8-b37997642089\") " pod="openshift-marketplace/certified-operators-rsnpq"
Sep 30 17:31:59 crc kubenswrapper[4778]: I0930 17:31:59.875548 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rsnpq"
Sep 30 17:32:00 crc kubenswrapper[4778]: I0930 17:32:00.537914 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rsnpq"]
Sep 30 17:32:00 crc kubenswrapper[4778]: W0930 17:32:00.544093 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26b19d5f_5d03_4c6c_94e8_b37997642089.slice/crio-30911cca561a9f9f121a8c3ed45c978c7c1317141ed7fba496d9e4e2679dfec9 WatchSource:0}: Error finding container 30911cca561a9f9f121a8c3ed45c978c7c1317141ed7fba496d9e4e2679dfec9: Status 404 returned error can't find the container with id 30911cca561a9f9f121a8c3ed45c978c7c1317141ed7fba496d9e4e2679dfec9
Sep 30 17:32:00 crc kubenswrapper[4778]: I0930 17:32:00.694503 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsnpq" event={"ID":"26b19d5f-5d03-4c6c-94e8-b37997642089","Type":"ContainerStarted","Data":"30911cca561a9f9f121a8c3ed45c978c7c1317141ed7fba496d9e4e2679dfec9"}
Sep 30 17:32:00 crc kubenswrapper[4778]: I0930 17:32:00.697406 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8hfcx" event={"ID":"9f70afd9-8a72-4776-8586-6afee1834e3f","Type":"ContainerStarted","Data":"dab2e327b2a1c45ac0fae0449ebcfa40788ea3593590023ec447aff83eaea8b5"}
Sep 30 17:32:00 crc kubenswrapper[4778]: I0930 17:32:00.697674 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8hfcx"
Sep 30 17:32:00 crc kubenswrapper[4778]: I0930 17:32:00.720066 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8hfcx" podStartSLOduration=3.061643499 podStartE2EDuration="47.720048309s" podCreationTimestamp="2025-09-30 17:31:13 +0000 UTC" firstStartedPulling="2025-09-30 17:31:15.636666248 +0000 UTC m=+814.626564051" lastFinishedPulling="2025-09-30 17:32:00.295071058 +0000 UTC m=+859.284968861" observedRunningTime="2025-09-30 17:32:00.716279536 +0000 UTC m=+859.706177339" watchObservedRunningTime="2025-09-30 17:32:00.720048309 +0000 UTC m=+859.709946122"
Sep 30 17:32:01 crc kubenswrapper[4778]: I0930 17:32:01.709878 4778 generic.go:334] "Generic (PLEG): container finished" podID="26b19d5f-5d03-4c6c-94e8-b37997642089" containerID="ce54ceb199e227195c9ffeb9711af89c7ae2532c48dad66d6b51e41355c12f7e" exitCode=0
Sep 30 17:32:01 crc kubenswrapper[4778]: I0930 17:32:01.709989 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsnpq" event={"ID":"26b19d5f-5d03-4c6c-94e8-b37997642089","Type":"ContainerDied","Data":"ce54ceb199e227195c9ffeb9711af89c7ae2532c48dad66d6b51e41355c12f7e"}
Sep 30 17:32:03 crc kubenswrapper[4778]: I0930 17:32:03.416434 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-49dlx"
Sep 30 17:32:03 crc kubenswrapper[4778]: I0930 17:32:03.728473 4778 generic.go:334] "Generic (PLEG): container finished" podID="26b19d5f-5d03-4c6c-94e8-b37997642089" containerID="ff39c41b19fbb7f622143e777814eb27c755b9cf8e3b12396df9b9e633f48bcb" exitCode=0
Sep 30 17:32:03 crc kubenswrapper[4778]: I0930 17:32:03.730629 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsnpq" event={"ID":"26b19d5f-5d03-4c6c-94e8-b37997642089","Type":"ContainerDied","Data":"ff39c41b19fbb7f622143e777814eb27c755b9cf8e3b12396df9b9e633f48bcb"}
Sep 30 17:32:04 crc kubenswrapper[4778]: I0930 17:32:04.112477 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-kx62l"
Sep 30 17:32:04 crc kubenswrapper[4778]: I0930 17:32:04.739725 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsnpq" event={"ID":"26b19d5f-5d03-4c6c-94e8-b37997642089","Type":"ContainerStarted","Data":"466cce5cb641f0e98077279f1af893423d1137eeb83b82c5457db6998d158b87"}
Sep 30 17:32:04 crc kubenswrapper[4778]: I0930 17:32:04.771339 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rsnpq" podStartSLOduration=3.258862491 podStartE2EDuration="5.771315532s" podCreationTimestamp="2025-09-30 17:31:59 +0000 UTC" firstStartedPulling="2025-09-30 17:32:01.712219079 +0000 UTC m=+860.702116892" lastFinishedPulling="2025-09-30 17:32:04.22467211 +0000 UTC m=+863.214569933" observedRunningTime="2025-09-30 17:32:04.761803543 +0000 UTC m=+863.751701396" watchObservedRunningTime="2025-09-30 17:32:04.771315532 +0000 UTC m=+863.761213365"
Sep 30 17:32:09 crc kubenswrapper[4778]: I0930 17:32:09.876172 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rsnpq"
Sep 30 17:32:09 crc kubenswrapper[4778]: I0930 17:32:09.876906 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rsnpq"
Sep 30 17:32:09 crc kubenswrapper[4778]: I0930 17:32:09.933793 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rsnpq"
Sep 30 17:32:10 crc kubenswrapper[4778]: I0930 17:32:10.874911 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rsnpq"
Sep 30 17:32:11 crc kubenswrapper[4778]: I0930 17:32:11.034013 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rsnpq"]
Sep 30 17:32:12 crc kubenswrapper[4778]: I0930 17:32:12.823515 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rsnpq" podUID="26b19d5f-5d03-4c6c-94e8-b37997642089" containerName="registry-server" containerID="cri-o://466cce5cb641f0e98077279f1af893423d1137eeb83b82c5457db6998d158b87" gracePeriod=2
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.366592 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rsnpq"
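[Editor's note] The certified-operators-rsnpq lifecycle above is easiest to read by folding the PLEG records into a per-pod timeline: extract-utilities and extract-content each start and die with exit code 0, then registry-server starts, passes its startup and readiness probes, and is killed with gracePeriod=2 once the DELETE arrives. A sketch that extracts those events from journal lines like these (the event={...} payloads are plain JSON; the helper itself is illustrative):

```python
import json
import re

EVENT = re.compile(r'event=(\{.*?\})')  # payloads here contain no nested braces

def pleg_timeline(lines):
    """Yield (pod UID, event type, container/sandbox ID) from 'SyncLoop (PLEG)' records."""
    for line in lines:
        if "SyncLoop (PLEG)" not in line:
            continue
        m = EVENT.search(line)
        if m:
            ev = json.loads(m.group(1))
            yield ev["ID"], ev["Type"], ev["Data"]
```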
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.391705 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b19d5f-5d03-4c6c-94e8-b37997642089-catalog-content\") pod \"26b19d5f-5d03-4c6c-94e8-b37997642089\" (UID: \"26b19d5f-5d03-4c6c-94e8-b37997642089\") "
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.391774 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt75h\" (UniqueName: \"kubernetes.io/projected/26b19d5f-5d03-4c6c-94e8-b37997642089-kube-api-access-bt75h\") pod \"26b19d5f-5d03-4c6c-94e8-b37997642089\" (UID: \"26b19d5f-5d03-4c6c-94e8-b37997642089\") "
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.391793 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b19d5f-5d03-4c6c-94e8-b37997642089-utilities\") pod \"26b19d5f-5d03-4c6c-94e8-b37997642089\" (UID: \"26b19d5f-5d03-4c6c-94e8-b37997642089\") "
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.392983 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26b19d5f-5d03-4c6c-94e8-b37997642089-utilities" (OuterVolumeSpecName: "utilities") pod "26b19d5f-5d03-4c6c-94e8-b37997642089" (UID: "26b19d5f-5d03-4c6c-94e8-b37997642089"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.404887 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26b19d5f-5d03-4c6c-94e8-b37997642089-kube-api-access-bt75h" (OuterVolumeSpecName: "kube-api-access-bt75h") pod "26b19d5f-5d03-4c6c-94e8-b37997642089" (UID: "26b19d5f-5d03-4c6c-94e8-b37997642089"). InnerVolumeSpecName "kube-api-access-bt75h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.494209 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt75h\" (UniqueName: \"kubernetes.io/projected/26b19d5f-5d03-4c6c-94e8-b37997642089-kube-api-access-bt75h\") on node \"crc\" DevicePath \"\""
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.494552 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26b19d5f-5d03-4c6c-94e8-b37997642089-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.524182 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26b19d5f-5d03-4c6c-94e8-b37997642089-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26b19d5f-5d03-4c6c-94e8-b37997642089" (UID: "26b19d5f-5d03-4c6c-94e8-b37997642089"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.595991 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26b19d5f-5d03-4c6c-94e8-b37997642089-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.833276 4778 generic.go:334] "Generic (PLEG): container finished" podID="26b19d5f-5d03-4c6c-94e8-b37997642089" containerID="466cce5cb641f0e98077279f1af893423d1137eeb83b82c5457db6998d158b87" exitCode=0
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.833444 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsnpq" event={"ID":"26b19d5f-5d03-4c6c-94e8-b37997642089","Type":"ContainerDied","Data":"466cce5cb641f0e98077279f1af893423d1137eeb83b82c5457db6998d158b87"}
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.833524 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rsnpq" event={"ID":"26b19d5f-5d03-4c6c-94e8-b37997642089","Type":"ContainerDied","Data":"30911cca561a9f9f121a8c3ed45c978c7c1317141ed7fba496d9e4e2679dfec9"}
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.833558 4778 scope.go:117] "RemoveContainer" containerID="466cce5cb641f0e98077279f1af893423d1137eeb83b82c5457db6998d158b87"
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.836438 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rsnpq"
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.874736 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rsnpq"]
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.879909 4778 scope.go:117] "RemoveContainer" containerID="ff39c41b19fbb7f622143e777814eb27c755b9cf8e3b12396df9b9e633f48bcb"
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.884783 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rsnpq"]
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.916407 4778 scope.go:117] "RemoveContainer" containerID="ce54ceb199e227195c9ffeb9711af89c7ae2532c48dad66d6b51e41355c12f7e"
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.952670 4778 scope.go:117] "RemoveContainer" containerID="466cce5cb641f0e98077279f1af893423d1137eeb83b82c5457db6998d158b87"
Sep 30 17:32:13 crc kubenswrapper[4778]: E0930 17:32:13.953633 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"466cce5cb641f0e98077279f1af893423d1137eeb83b82c5457db6998d158b87\": container with ID starting with 466cce5cb641f0e98077279f1af893423d1137eeb83b82c5457db6998d158b87 not found: ID does not exist" containerID="466cce5cb641f0e98077279f1af893423d1137eeb83b82c5457db6998d158b87"
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.953696 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466cce5cb641f0e98077279f1af893423d1137eeb83b82c5457db6998d158b87"} err="failed to get container status \"466cce5cb641f0e98077279f1af893423d1137eeb83b82c5457db6998d158b87\": rpc error: code = NotFound desc = could not find container \"466cce5cb641f0e98077279f1af893423d1137eeb83b82c5457db6998d158b87\": container with ID starting with 466cce5cb641f0e98077279f1af893423d1137eeb83b82c5457db6998d158b87 not found: ID does not exist"
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.953735 4778 scope.go:117] "RemoveContainer" containerID="ff39c41b19fbb7f622143e777814eb27c755b9cf8e3b12396df9b9e633f48bcb"
Sep 30 17:32:13 crc kubenswrapper[4778]: E0930 17:32:13.954293 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff39c41b19fbb7f622143e777814eb27c755b9cf8e3b12396df9b9e633f48bcb\": container with ID starting with ff39c41b19fbb7f622143e777814eb27c755b9cf8e3b12396df9b9e633f48bcb not found: ID does not exist" containerID="ff39c41b19fbb7f622143e777814eb27c755b9cf8e3b12396df9b9e633f48bcb"
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.954358 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff39c41b19fbb7f622143e777814eb27c755b9cf8e3b12396df9b9e633f48bcb"} err="failed to get container status \"ff39c41b19fbb7f622143e777814eb27c755b9cf8e3b12396df9b9e633f48bcb\": rpc error: code = NotFound desc = could not find container \"ff39c41b19fbb7f622143e777814eb27c755b9cf8e3b12396df9b9e633f48bcb\": container with ID starting with ff39c41b19fbb7f622143e777814eb27c755b9cf8e3b12396df9b9e633f48bcb not found: ID does not exist"
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.954403 4778 scope.go:117] "RemoveContainer" containerID="ce54ceb199e227195c9ffeb9711af89c7ae2532c48dad66d6b51e41355c12f7e"
Sep 30 17:32:13 crc kubenswrapper[4778]: E0930 17:32:13.954842 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce54ceb199e227195c9ffeb9711af89c7ae2532c48dad66d6b51e41355c12f7e\": container with ID starting with ce54ceb199e227195c9ffeb9711af89c7ae2532c48dad66d6b51e41355c12f7e not found: ID does not exist" containerID="ce54ceb199e227195c9ffeb9711af89c7ae2532c48dad66d6b51e41355c12f7e"
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.954890 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce54ceb199e227195c9ffeb9711af89c7ae2532c48dad66d6b51e41355c12f7e"} err="failed to get container status \"ce54ceb199e227195c9ffeb9711af89c7ae2532c48dad66d6b51e41355c12f7e\": rpc error: code = NotFound desc = could not find container \"ce54ceb199e227195c9ffeb9711af89c7ae2532c48dad66d6b51e41355c12f7e\": container with ID starting with ce54ceb199e227195c9ffeb9711af89c7ae2532c48dad66d6b51e41355c12f7e not found: ID does not exist"
Sep 30 17:32:13 crc kubenswrapper[4778]: I0930 17:32:13.972074 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7bdb6cfb74-8hfcx"
Sep 30 17:32:15 crc kubenswrapper[4778]: I0930 17:32:15.732093 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26b19d5f-5d03-4c6c-94e8-b37997642089" path="/var/lib/kubelet/pods/26b19d5f-5d03-4c6c-94e8-b37997642089/volumes"
Sep 30 17:32:29 crc kubenswrapper[4778]: I0930 17:32:29.888793 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-w7bd2"]
Sep 30 17:32:29 crc kubenswrapper[4778]: E0930 17:32:29.889997 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b19d5f-5d03-4c6c-94e8-b37997642089" containerName="registry-server"
Sep 30 17:32:29 crc kubenswrapper[4778]: I0930 17:32:29.890013 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b19d5f-5d03-4c6c-94e8-b37997642089" containerName="registry-server"
Sep 30 17:32:29 crc kubenswrapper[4778]: E0930 17:32:29.890032 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b19d5f-5d03-4c6c-94e8-b37997642089" containerName="extract-content"
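[Editor's note] The cpu_manager/memory_manager "RemoveStaleState" and "Deleted CPUSet assignment" bursts above fire when a new pod is admitted while checkpointed resource assignments still reference containers of a pod that no longer exists (here the deleted certified-operators registry pod). The pattern, reduced to a sketch with illustrative names (not kubelet code):

```python
def remove_stale_state(assignments: dict, live_pod_uids: set) -> dict:
    # assignments: {(pod_uid, container_name): assigned_resources}
    # Entries keyed by pods that are gone (e.g. UID 26b19d5f-... above) are
    # dropped, mirroring the per-container "Deleted CPUSet assignment" lines.
    return {key: value for key, value in assignments.items()
            if key[0] in live_pod_uids}
```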
Sep 30 17:32:29 crc kubenswrapper[4778]: I0930 17:32:29.890041 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b19d5f-5d03-4c6c-94e8-b37997642089" containerName="extract-content"
Sep 30 17:32:29 crc kubenswrapper[4778]: E0930 17:32:29.890060 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26b19d5f-5d03-4c6c-94e8-b37997642089" containerName="extract-utilities"
Sep 30 17:32:29 crc kubenswrapper[4778]: I0930 17:32:29.890068 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="26b19d5f-5d03-4c6c-94e8-b37997642089" containerName="extract-utilities"
Sep 30 17:32:29 crc kubenswrapper[4778]: I0930 17:32:29.890379 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="26b19d5f-5d03-4c6c-94e8-b37997642089" containerName="registry-server"
Sep 30 17:32:29 crc kubenswrapper[4778]: I0930 17:32:29.891290 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-w7bd2"
Sep 30 17:32:29 crc kubenswrapper[4778]: I0930 17:32:29.898126 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-rhkjg"
Sep 30 17:32:29 crc kubenswrapper[4778]: I0930 17:32:29.898358 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Sep 30 17:32:29 crc kubenswrapper[4778]: I0930 17:32:29.912527 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Sep 30 17:32:29 crc kubenswrapper[4778]: I0930 17:32:29.913338 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Sep 30 17:32:29 crc kubenswrapper[4778]: I0930 17:32:29.918314 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-w7bd2"]
Sep 30 17:32:29 crc kubenswrapper[4778]: I0930 17:32:29.966083 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-697zn"]
Sep 30 17:32:29 crc kubenswrapper[4778]: I0930 17:32:29.967667 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-697zn"
Sep 30 17:32:29 crc kubenswrapper[4778]: I0930 17:32:29.972054 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Sep 30 17:32:29 crc kubenswrapper[4778]: I0930 17:32:29.981302 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-697zn"]
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.016139 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-config\") pod \"dnsmasq-dns-78dd6ddcc-697zn\" (UID: \"bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06\") " pod="openstack/dnsmasq-dns-78dd6ddcc-697zn"
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.016480 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt6cr\" (UniqueName: \"kubernetes.io/projected/9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0-kube-api-access-bt6cr\") pod \"dnsmasq-dns-675f4bcbfc-w7bd2\" (UID: \"9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-w7bd2"
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.016509 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0-config\") pod \"dnsmasq-dns-675f4bcbfc-w7bd2\" (UID: \"9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-w7bd2"
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.016657 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltzt7\" (UniqueName: \"kubernetes.io/projected/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-kube-api-access-ltzt7\") pod \"dnsmasq-dns-78dd6ddcc-697zn\" (UID: \"bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06\") " pod="openstack/dnsmasq-dns-78dd6ddcc-697zn"
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.016797 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-697zn\" (UID: \"bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06\") " pod="openstack/dnsmasq-dns-78dd6ddcc-697zn"
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.118326 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-config\") pod \"dnsmasq-dns-78dd6ddcc-697zn\" (UID: \"bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06\") " pod="openstack/dnsmasq-dns-78dd6ddcc-697zn"
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.118385 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt6cr\" (UniqueName: \"kubernetes.io/projected/9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0-kube-api-access-bt6cr\") pod \"dnsmasq-dns-675f4bcbfc-w7bd2\" (UID: \"9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-w7bd2"
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.118409 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0-config\") pod \"dnsmasq-dns-675f4bcbfc-w7bd2\" (UID: \"9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-w7bd2"
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.118426 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltzt7\" (UniqueName: \"kubernetes.io/projected/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-kube-api-access-ltzt7\") pod \"dnsmasq-dns-78dd6ddcc-697zn\" (UID: \"bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06\") " pod="openstack/dnsmasq-dns-78dd6ddcc-697zn"
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.118466 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-697zn\" (UID: \"bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06\") " pod="openstack/dnsmasq-dns-78dd6ddcc-697zn"
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.119334 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-697zn\" (UID: \"bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06\") " pod="openstack/dnsmasq-dns-78dd6ddcc-697zn"
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.119536 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0-config\") pod \"dnsmasq-dns-675f4bcbfc-w7bd2\" (UID: \"9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-w7bd2"
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.119913 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-config\") pod \"dnsmasq-dns-78dd6ddcc-697zn\" (UID: \"bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06\") " pod="openstack/dnsmasq-dns-78dd6ddcc-697zn"
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.148975 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltzt7\" (UniqueName: \"kubernetes.io/projected/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-kube-api-access-ltzt7\") pod \"dnsmasq-dns-78dd6ddcc-697zn\" (UID: \"bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06\") " pod="openstack/dnsmasq-dns-78dd6ddcc-697zn"
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.159428 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt6cr\" (UniqueName: \"kubernetes.io/projected/9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0-kube-api-access-bt6cr\") pod \"dnsmasq-dns-675f4bcbfc-w7bd2\" (UID: \"9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-w7bd2"
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.234189 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-w7bd2"
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.288992 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-697zn"
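[Editor's note] For each dnsmasq pod the reconciler walks every volume through the same three stages visible above: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded. A small checker for tracing one volume through journal lines like these (stage strings copied from the log; the helper itself is illustrative):

```python
STAGES = (
    "operationExecutor.VerifyControllerAttachedVolume started",
    "operationExecutor.MountVolume started",
    "MountVolume.SetUp succeeded",
)

def mount_progress(lines, volume_key: str) -> int:
    # Count how many stages a volume has reached, e.g. pass
    # "bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-dns-svc" for the dns-svc volume above.
    reached = 0
    for line in lines:
        if volume_key in line and reached < 3 and STAGES[reached] in line:
            reached += 1
    return reached
```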
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.723371 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-w7bd2"]
Sep 30 17:32:30 crc kubenswrapper[4778]: W0930 17:32:30.729064 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f9bc6f1_8f73_4e50_b7f2_901b11c9cfb0.slice/crio-45055cceb47f9ef2d66da8e28000fe82463171b91c6c96c6b7e39de7ee850754 WatchSource:0}: Error finding container 45055cceb47f9ef2d66da8e28000fe82463171b91c6c96c6b7e39de7ee850754: Status 404 returned error can't find the container with id 45055cceb47f9ef2d66da8e28000fe82463171b91c6c96c6b7e39de7ee850754
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.733685 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.777031 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-697zn"]
Sep 30 17:32:30 crc kubenswrapper[4778]: W0930 17:32:30.781030 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd3b0f70_4d1c_41f2_b7a2_a19e2aeb7f06.slice/crio-06388a4a7022f84d2152dcd1e1862f5645bd88302bb8ebf09ca42a2c4389c0a0 WatchSource:0}: Error finding container 06388a4a7022f84d2152dcd1e1862f5645bd88302bb8ebf09ca42a2c4389c0a0: Status 404 returned error can't find the container with id 06388a4a7022f84d2152dcd1e1862f5645bd88302bb8ebf09ca42a2c4389c0a0
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.979753 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-697zn" event={"ID":"bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06","Type":"ContainerStarted","Data":"06388a4a7022f84d2152dcd1e1862f5645bd88302bb8ebf09ca42a2c4389c0a0"}
Sep 30 17:32:30 crc kubenswrapper[4778]: I0930 17:32:30.981219 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-w7bd2" event={"ID":"9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0","Type":"ContainerStarted","Data":"45055cceb47f9ef2d66da8e28000fe82463171b91c6c96c6b7e39de7ee850754"}
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.164357 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-w7bd2"]
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.189783 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ch8th"]
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.199640 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ch8th"]
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.200839 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ch8th"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.277303 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xssr\" (UniqueName: \"kubernetes.io/projected/567d6f34-762c-4560-8af1-be9c5f0a1945-kube-api-access-7xssr\") pod \"dnsmasq-dns-666b6646f7-ch8th\" (UID: \"567d6f34-762c-4560-8af1-be9c5f0a1945\") " pod="openstack/dnsmasq-dns-666b6646f7-ch8th"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.277450 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/567d6f34-762c-4560-8af1-be9c5f0a1945-config\") pod \"dnsmasq-dns-666b6646f7-ch8th\" (UID: \"567d6f34-762c-4560-8af1-be9c5f0a1945\") " pod="openstack/dnsmasq-dns-666b6646f7-ch8th"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.277522 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/567d6f34-762c-4560-8af1-be9c5f0a1945-dns-svc\") pod \"dnsmasq-dns-666b6646f7-ch8th\" (UID: \"567d6f34-762c-4560-8af1-be9c5f0a1945\") " pod="openstack/dnsmasq-dns-666b6646f7-ch8th"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.378974 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xssr\" (UniqueName: \"kubernetes.io/projected/567d6f34-762c-4560-8af1-be9c5f0a1945-kube-api-access-7xssr\") pod \"dnsmasq-dns-666b6646f7-ch8th\" (UID: \"567d6f34-762c-4560-8af1-be9c5f0a1945\") " pod="openstack/dnsmasq-dns-666b6646f7-ch8th"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.379084 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/567d6f34-762c-4560-8af1-be9c5f0a1945-config\") pod \"dnsmasq-dns-666b6646f7-ch8th\" (UID: \"567d6f34-762c-4560-8af1-be9c5f0a1945\") " pod="openstack/dnsmasq-dns-666b6646f7-ch8th"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.379138 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/567d6f34-762c-4560-8af1-be9c5f0a1945-dns-svc\") pod \"dnsmasq-dns-666b6646f7-ch8th\" (UID: \"567d6f34-762c-4560-8af1-be9c5f0a1945\") " pod="openstack/dnsmasq-dns-666b6646f7-ch8th"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.380496 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/567d6f34-762c-4560-8af1-be9c5f0a1945-config\") pod \"dnsmasq-dns-666b6646f7-ch8th\" (UID: \"567d6f34-762c-4560-8af1-be9c5f0a1945\") " pod="openstack/dnsmasq-dns-666b6646f7-ch8th"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.380545 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/567d6f34-762c-4560-8af1-be9c5f0a1945-dns-svc\") pod \"dnsmasq-dns-666b6646f7-ch8th\" (UID: \"567d6f34-762c-4560-8af1-be9c5f0a1945\") " pod="openstack/dnsmasq-dns-666b6646f7-ch8th"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.420808 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xssr\" (UniqueName: \"kubernetes.io/projected/567d6f34-762c-4560-8af1-be9c5f0a1945-kube-api-access-7xssr\") pod \"dnsmasq-dns-666b6646f7-ch8th\" (UID: \"567d6f34-762c-4560-8af1-be9c5f0a1945\") " pod="openstack/dnsmasq-dns-666b6646f7-ch8th"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.497329 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-697zn"]
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.517978 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-6mk8g"]
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.521195 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.533400 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ch8th"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.554707 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-6mk8g"]
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.590571 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-config\") pod \"dnsmasq-dns-57d769cc4f-6mk8g\" (UID: \"7a383937-1f37-4d7c-9ddf-6aefd9fd7329\") " pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.590882 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpx8n\" (UniqueName: \"kubernetes.io/projected/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-kube-api-access-zpx8n\") pod \"dnsmasq-dns-57d769cc4f-6mk8g\" (UID: \"7a383937-1f37-4d7c-9ddf-6aefd9fd7329\") " pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.591108 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-6mk8g\" (UID: \"7a383937-1f37-4d7c-9ddf-6aefd9fd7329\") " pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.691985 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-config\") pod \"dnsmasq-dns-57d769cc4f-6mk8g\" (UID: \"7a383937-1f37-4d7c-9ddf-6aefd9fd7329\") " pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.693122 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpx8n\" (UniqueName: \"kubernetes.io/projected/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-kube-api-access-zpx8n\") pod \"dnsmasq-dns-57d769cc4f-6mk8g\" (UID: \"7a383937-1f37-4d7c-9ddf-6aefd9fd7329\") " pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.693277 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-6mk8g\" (UID: \"7a383937-1f37-4d7c-9ddf-6aefd9fd7329\") " pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g"
Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.694279 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-6mk8g\" (UID:
\"7a383937-1f37-4d7c-9ddf-6aefd9fd7329\") " pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g" Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.693078 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-config\") pod \"dnsmasq-dns-57d769cc4f-6mk8g\" (UID: \"7a383937-1f37-4d7c-9ddf-6aefd9fd7329\") " pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g" Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.716546 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpx8n\" (UniqueName: \"kubernetes.io/projected/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-kube-api-access-zpx8n\") pod \"dnsmasq-dns-57d769cc4f-6mk8g\" (UID: \"7a383937-1f37-4d7c-9ddf-6aefd9fd7329\") " pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g" Sep 30 17:32:33 crc kubenswrapper[4778]: I0930 17:32:33.851642 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.354889 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.359854 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.362232 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.363444 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.363654 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.363790 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-dhbr7" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.364027 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.364424 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.364527 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.373653 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.504332 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mnd2\" (UniqueName: \"kubernetes.io/projected/227061d9-b3e7-4711-92f5-283ca4af1412-kube-api-access-4mnd2\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.504409 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/227061d9-b3e7-4711-92f5-283ca4af1412-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.504586 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/227061d9-b3e7-4711-92f5-283ca4af1412-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.504699 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/227061d9-b3e7-4711-92f5-283ca4af1412-server-conf\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.504762 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/227061d9-b3e7-4711-92f5-283ca4af1412-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.504790 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/227061d9-b3e7-4711-92f5-283ca4af1412-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.504884 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/227061d9-b3e7-4711-92f5-283ca4af1412-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.504905 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.504922 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/227061d9-b3e7-4711-92f5-283ca4af1412-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.504959 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/227061d9-b3e7-4711-92f5-283ca4af1412-config-data\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.504984 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/227061d9-b3e7-4711-92f5-283ca4af1412-pod-info\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.605934 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/227061d9-b3e7-4711-92f5-283ca4af1412-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.605976 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/227061d9-b3e7-4711-92f5-283ca4af1412-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.606023 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/227061d9-b3e7-4711-92f5-283ca4af1412-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.606045 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.606060 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/227061d9-b3e7-4711-92f5-283ca4af1412-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.606076 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/227061d9-b3e7-4711-92f5-283ca4af1412-config-data\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.606090 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/227061d9-b3e7-4711-92f5-283ca4af1412-pod-info\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.606105 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mnd2\" (UniqueName: \"kubernetes.io/projected/227061d9-b3e7-4711-92f5-283ca4af1412-kube-api-access-4mnd2\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.606133 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/227061d9-b3e7-4711-92f5-283ca4af1412-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.606157 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/227061d9-b3e7-4711-92f5-283ca4af1412-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 
17:32:34.606183 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/227061d9-b3e7-4711-92f5-283ca4af1412-server-conf\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.607199 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.607759 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/227061d9-b3e7-4711-92f5-283ca4af1412-config-data\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.607754 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/227061d9-b3e7-4711-92f5-283ca4af1412-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.608296 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/227061d9-b3e7-4711-92f5-283ca4af1412-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.608369 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/227061d9-b3e7-4711-92f5-283ca4af1412-server-conf\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.611791 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/227061d9-b3e7-4711-92f5-283ca4af1412-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.612219 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/227061d9-b3e7-4711-92f5-283ca4af1412-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.619134 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/227061d9-b3e7-4711-92f5-283ca4af1412-pod-info\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.620344 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/227061d9-b3e7-4711-92f5-283ca4af1412-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " 
pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.623379 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/227061d9-b3e7-4711-92f5-283ca4af1412-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.627797 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mnd2\" (UniqueName: \"kubernetes.io/projected/227061d9-b3e7-4711-92f5-283ca4af1412-kube-api-access-4mnd2\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.642513 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.644296 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.647239 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"227061d9-b3e7-4711-92f5-283ca4af1412\") " pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.650509 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.650689 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.651369 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.651763 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-d65rm" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.651933 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.652300 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.652755 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.662044 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.696925 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.708023 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.708076 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.708112 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.708145 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.708168 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.708201 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.708256 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.708289 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f8h2\" (UniqueName: \"kubernetes.io/projected/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-kube-api-access-5f8h2\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.708335 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.708369 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.708412 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.809304 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.809395 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.809421 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.810128 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.809447 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.810230 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.810502 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 
17:32:34.810258 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.810916 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.811276 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.811433 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.812701 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.812820 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f8h2\" (UniqueName: \"kubernetes.io/projected/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-kube-api-access-5f8h2\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.812928 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.812968 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.813243 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.814265 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.814893 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.816919 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.819093 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.823003 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.836334 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f8h2\" (UniqueName: \"kubernetes.io/projected/a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1-kube-api-access-5f8h2\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:34 crc kubenswrapper[4778]: I0930 17:32:34.838526 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:35 crc kubenswrapper[4778]: I0930 17:32:35.020868 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.150704 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.153395 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.159906 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.163199 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-765rj" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.163641 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.164069 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.164264 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.187462 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.187968 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.199995 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.202159 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.204667 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.205894 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-gbtf7" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.205897 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.206084 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.208631 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.269634 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9705cb32-3888-4b76-863d-7c4dd57185bc-config-data-default\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.269699 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9705cb32-3888-4b76-863d-7c4dd57185bc-secrets\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.269726 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9705cb32-3888-4b76-863d-7c4dd57185bc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " 
pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.269744 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02bfdd7b-c081-4815-9a55-f39fa4d0384f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.269773 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/02bfdd7b-c081-4815-9a55-f39fa4d0384f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.269795 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nggwv\" (UniqueName: \"kubernetes.io/projected/02bfdd7b-c081-4815-9a55-f39fa4d0384f-kube-api-access-nggwv\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.269827 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02bfdd7b-c081-4815-9a55-f39fa4d0384f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.269843 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.269862 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/02bfdd7b-c081-4815-9a55-f39fa4d0384f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.269884 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9705cb32-3888-4b76-863d-7c4dd57185bc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.269908 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9705cb32-3888-4b76-863d-7c4dd57185bc-kolla-config\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.269935 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9705cb32-3888-4b76-863d-7c4dd57185bc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.270129 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/02bfdd7b-c081-4815-9a55-f39fa4d0384f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.270167 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/02bfdd7b-c081-4815-9a55-f39fa4d0384f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.270189 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/02bfdd7b-c081-4815-9a55-f39fa4d0384f-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.270210 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9705cb32-3888-4b76-863d-7c4dd57185bc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.270227 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.270247 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npr4t\" (UniqueName: \"kubernetes.io/projected/9705cb32-3888-4b76-863d-7c4dd57185bc-kube-api-access-npr4t\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.371717 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9705cb32-3888-4b76-863d-7c4dd57185bc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.371772 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.371789 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npr4t\" (UniqueName: \"kubernetes.io/projected/9705cb32-3888-4b76-863d-7c4dd57185bc-kube-api-access-npr4t\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " 
pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.371814 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9705cb32-3888-4b76-863d-7c4dd57185bc-config-data-default\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.371848 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9705cb32-3888-4b76-863d-7c4dd57185bc-secrets\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.371864 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9705cb32-3888-4b76-863d-7c4dd57185bc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.371887 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02bfdd7b-c081-4815-9a55-f39fa4d0384f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.371916 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/02bfdd7b-c081-4815-9a55-f39fa4d0384f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.371942 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nggwv\" (UniqueName: \"kubernetes.io/projected/02bfdd7b-c081-4815-9a55-f39fa4d0384f-kube-api-access-nggwv\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.371960 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02bfdd7b-c081-4815-9a55-f39fa4d0384f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.371977 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.371997 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/02bfdd7b-c081-4815-9a55-f39fa4d0384f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.372017 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9705cb32-3888-4b76-863d-7c4dd57185bc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.372041 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9705cb32-3888-4b76-863d-7c4dd57185bc-kolla-config\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.372065 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9705cb32-3888-4b76-863d-7c4dd57185bc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.372081 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/02bfdd7b-c081-4815-9a55-f39fa4d0384f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.372101 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/02bfdd7b-c081-4815-9a55-f39fa4d0384f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.372125 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/02bfdd7b-c081-4815-9a55-f39fa4d0384f-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.372148 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.373069 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/02bfdd7b-c081-4815-9a55-f39fa4d0384f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.373167 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/9705cb32-3888-4b76-863d-7c4dd57185bc-config-data-default\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.373393 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: 
\"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.374267 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/02bfdd7b-c081-4815-9a55-f39fa4d0384f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.374839 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/02bfdd7b-c081-4815-9a55-f39fa4d0384f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.374879 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9705cb32-3888-4b76-863d-7c4dd57185bc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.375055 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/02bfdd7b-c081-4815-9a55-f39fa4d0384f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.375655 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/9705cb32-3888-4b76-863d-7c4dd57185bc-kolla-config\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.377350 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/9705cb32-3888-4b76-863d-7c4dd57185bc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.379216 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9705cb32-3888-4b76-863d-7c4dd57185bc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.379321 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02bfdd7b-c081-4815-9a55-f39fa4d0384f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.382704 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/02bfdd7b-c081-4815-9a55-f39fa4d0384f-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.383070 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/02bfdd7b-c081-4815-9a55-f39fa4d0384f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.385952 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/9705cb32-3888-4b76-863d-7c4dd57185bc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.388822 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/9705cb32-3888-4b76-863d-7c4dd57185bc-secrets\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.397800 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npr4t\" (UniqueName: \"kubernetes.io/projected/9705cb32-3888-4b76-863d-7c4dd57185bc-kube-api-access-npr4t\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.400248 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nggwv\" (UniqueName: \"kubernetes.io/projected/02bfdd7b-c081-4815-9a55-f39fa4d0384f-kube-api-access-nggwv\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.400739 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"openstack-cell1-galera-0\" (UID: \"02bfdd7b-c081-4815-9a55-f39fa4d0384f\") " pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.406469 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-galera-0\" (UID: \"9705cb32-3888-4b76-863d-7c4dd57185bc\") " pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.501341 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.518833 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.677407 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.680167 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.682948 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-s7m42" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.682976 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.689009 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.690067 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.778483 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm2l5\" (UniqueName: \"kubernetes.io/projected/21244498-232a-4725-be68-da731564a70b-kube-api-access-cm2l5\") pod \"memcached-0\" (UID: \"21244498-232a-4725-be68-da731564a70b\") " pod="openstack/memcached-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.778560 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/21244498-232a-4725-be68-da731564a70b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"21244498-232a-4725-be68-da731564a70b\") " pod="openstack/memcached-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.778585 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21244498-232a-4725-be68-da731564a70b-kolla-config\") pod \"memcached-0\" (UID: \"21244498-232a-4725-be68-da731564a70b\") " pod="openstack/memcached-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.778600 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21244498-232a-4725-be68-da731564a70b-config-data\") pod \"memcached-0\" (UID: \"21244498-232a-4725-be68-da731564a70b\") " pod="openstack/memcached-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.778631 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21244498-232a-4725-be68-da731564a70b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"21244498-232a-4725-be68-da731564a70b\") " pod="openstack/memcached-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.879559 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/21244498-232a-4725-be68-da731564a70b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"21244498-232a-4725-be68-da731564a70b\") " pod="openstack/memcached-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.879605 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21244498-232a-4725-be68-da731564a70b-kolla-config\") pod \"memcached-0\" (UID: \"21244498-232a-4725-be68-da731564a70b\") " pod="openstack/memcached-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.879639 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/21244498-232a-4725-be68-da731564a70b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"21244498-232a-4725-be68-da731564a70b\") " pod="openstack/memcached-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.879657 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21244498-232a-4725-be68-da731564a70b-config-data\") pod \"memcached-0\" (UID: \"21244498-232a-4725-be68-da731564a70b\") " pod="openstack/memcached-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.879733 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm2l5\" (UniqueName: \"kubernetes.io/projected/21244498-232a-4725-be68-da731564a70b-kube-api-access-cm2l5\") pod \"memcached-0\" (UID: \"21244498-232a-4725-be68-da731564a70b\") " pod="openstack/memcached-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.881222 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/21244498-232a-4725-be68-da731564a70b-config-data\") pod \"memcached-0\" (UID: \"21244498-232a-4725-be68-da731564a70b\") " pod="openstack/memcached-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.881248 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/21244498-232a-4725-be68-da731564a70b-kolla-config\") pod \"memcached-0\" (UID: \"21244498-232a-4725-be68-da731564a70b\") " pod="openstack/memcached-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.888928 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21244498-232a-4725-be68-da731564a70b-combined-ca-bundle\") pod \"memcached-0\" (UID: \"21244498-232a-4725-be68-da731564a70b\") " pod="openstack/memcached-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.894719 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/21244498-232a-4725-be68-da731564a70b-memcached-tls-certs\") pod \"memcached-0\" (UID: \"21244498-232a-4725-be68-da731564a70b\") " pod="openstack/memcached-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.905536 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm2l5\" (UniqueName: \"kubernetes.io/projected/21244498-232a-4725-be68-da731564a70b-kube-api-access-cm2l5\") pod \"memcached-0\" (UID: \"21244498-232a-4725-be68-da731564a70b\") " pod="openstack/memcached-0" Sep 30 17:32:37 crc kubenswrapper[4778]: I0930 17:32:37.997164 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.575913 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-79nnj"] Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.577495 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-79nnj" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.588403 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-qd6f4"] Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.589133 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.589744 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.590022 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-bpnp6" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.616971 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-qd6f4" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.643800 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-79nnj"] Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.657084 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qd6f4"] Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.673104 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d43aaa67-2f2b-4045-80af-3593da87ed64-var-log-ovn\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.673199 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-var-log\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.673326 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d43aaa67-2f2b-4045-80af-3593da87ed64-ovn-controller-tls-certs\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.673383 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d43aaa67-2f2b-4045-80af-3593da87ed64-scripts\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.673410 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-scripts\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.673433 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d43aaa67-2f2b-4045-80af-3593da87ed64-var-run\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " 
pod="openstack/ovn-controller-79nnj" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.673453 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d43aaa67-2f2b-4045-80af-3593da87ed64-combined-ca-bundle\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.673478 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66rxl\" (UniqueName: \"kubernetes.io/projected/d43aaa67-2f2b-4045-80af-3593da87ed64-kube-api-access-66rxl\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.673500 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-var-run\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.673516 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-etc-ovs\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.673536 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5n7x\" (UniqueName: \"kubernetes.io/projected/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-kube-api-access-g5n7x\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.673635 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d43aaa67-2f2b-4045-80af-3593da87ed64-var-run-ovn\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.673672 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-var-lib\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.774743 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-var-log\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4" Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.774791 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d43aaa67-2f2b-4045-80af-3593da87ed64-ovn-controller-tls-certs\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj" 
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.774835 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d43aaa67-2f2b-4045-80af-3593da87ed64-scripts\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.774877 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-scripts\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.774913 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d43aaa67-2f2b-4045-80af-3593da87ed64-var-run\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.774939 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d43aaa67-2f2b-4045-80af-3593da87ed64-combined-ca-bundle\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.774970 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66rxl\" (UniqueName: \"kubernetes.io/projected/d43aaa67-2f2b-4045-80af-3593da87ed64-kube-api-access-66rxl\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.775023 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-etc-ovs\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.775044 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-var-run\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.775075 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5n7x\" (UniqueName: \"kubernetes.io/projected/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-kube-api-access-g5n7x\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.775107 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d43aaa67-2f2b-4045-80af-3593da87ed64-var-run-ovn\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.775133 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-var-lib\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.775179 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d43aaa67-2f2b-4045-80af-3593da87ed64-var-log-ovn\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.776922 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-var-lib\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.777278 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-etc-ovs\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.779207 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d43aaa67-2f2b-4045-80af-3593da87ed64-scripts\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.780959 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-scripts\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.781523 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/d43aaa67-2f2b-4045-80af-3593da87ed64-var-run-ovn\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.783802 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-var-log\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.783944 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/d43aaa67-2f2b-4045-80af-3593da87ed64-var-log-ovn\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.784198 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-var-run\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.784258 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d43aaa67-2f2b-4045-80af-3593da87ed64-var-run\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.786732 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/d43aaa67-2f2b-4045-80af-3593da87ed64-ovn-controller-tls-certs\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.789225 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d43aaa67-2f2b-4045-80af-3593da87ed64-combined-ca-bundle\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.796500 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66rxl\" (UniqueName: \"kubernetes.io/projected/d43aaa67-2f2b-4045-80af-3593da87ed64-kube-api-access-66rxl\") pod \"ovn-controller-79nnj\" (UID: \"d43aaa67-2f2b-4045-80af-3593da87ed64\") " pod="openstack/ovn-controller-79nnj"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.802599 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5n7x\" (UniqueName: \"kubernetes.io/projected/9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6-kube-api-access-g5n7x\") pod \"ovn-controller-ovs-qd6f4\" (UID: \"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6\") " pod="openstack/ovn-controller-ovs-qd6f4"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.911086 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-79nnj"
Sep 30 17:32:43 crc kubenswrapper[4778]: I0930 17:32:43.962299 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-qd6f4"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.105352 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ch8th"]
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.431647 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.436717 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.439037 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.440794 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.441133 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.441270 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.441783 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-p2vh2"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.442577 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.484687 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c60ce32d-deec-4c60-9192-6b090ae53773-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.485032 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c60ce32d-deec-4c60-9192-6b090ae53773-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.485074 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c60ce32d-deec-4c60-9192-6b090ae53773-config\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.485099 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxcbt\" (UniqueName: \"kubernetes.io/projected/c60ce32d-deec-4c60-9192-6b090ae53773-kube-api-access-cxcbt\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.485122 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c60ce32d-deec-4c60-9192-6b090ae53773-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.485143 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.485161 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c60ce32d-deec-4c60-9192-6b090ae53773-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.485192 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c60ce32d-deec-4c60-9192-6b090ae53773-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.586101 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c60ce32d-deec-4c60-9192-6b090ae53773-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.586172 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c60ce32d-deec-4c60-9192-6b090ae53773-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.586265 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c60ce32d-deec-4c60-9192-6b090ae53773-config\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.586338 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxcbt\" (UniqueName: \"kubernetes.io/projected/c60ce32d-deec-4c60-9192-6b090ae53773-kube-api-access-cxcbt\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.586442 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c60ce32d-deec-4c60-9192-6b090ae53773-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.586527 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.586571 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c60ce32d-deec-4c60-9192-6b090ae53773-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.586807 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c60ce32d-deec-4c60-9192-6b090ae53773-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.587052 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.587162 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c60ce32d-deec-4c60-9192-6b090ae53773-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.588507 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c60ce32d-deec-4c60-9192-6b090ae53773-config\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.588887 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c60ce32d-deec-4c60-9192-6b090ae53773-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.592315 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/c60ce32d-deec-4c60-9192-6b090ae53773-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.593464 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c60ce32d-deec-4c60-9192-6b090ae53773-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.594519 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c60ce32d-deec-4c60-9192-6b090ae53773-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.601480 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxcbt\" (UniqueName: \"kubernetes.io/projected/c60ce32d-deec-4c60-9192-6b090ae53773-kube-api-access-cxcbt\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.617661 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"c60ce32d-deec-4c60-9192-6b090ae53773\") " pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: W0930 17:32:44.687948 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod567d6f34_762c_4560_8af1_be9c5f0a1945.slice/crio-f02bc782a91f2351e94c8a09a57c89ad9565f53e86d624e2daac26c7be01d5db WatchSource:0}: Error finding container f02bc782a91f2351e94c8a09a57c89ad9565f53e86d624e2daac26c7be01d5db: Status 404 returned error can't find the container with id f02bc782a91f2351e94c8a09a57c89ad9565f53e86d624e2daac26c7be01d5db
Sep 30 17:32:44 crc kubenswrapper[4778]: E0930 17:32:44.727951 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Sep 30 17:32:44 crc kubenswrapper[4778]: E0930 17:32:44.728206 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ltzt7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-697zn_openstack(bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Sep 30 17:32:44 crc kubenswrapper[4778]: E0930 17:32:44.730010 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-697zn" podUID="bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06"
Sep 30 17:32:44 crc kubenswrapper[4778]: I0930 17:32:44.768667 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Sep 30 17:32:44 crc kubenswrapper[4778]: E0930 17:32:44.809579 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Sep 30 17:32:44 crc kubenswrapper[4778]: E0930 17:32:44.810022 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bt6cr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-w7bd2_openstack(9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Sep 30 17:32:44 crc kubenswrapper[4778]: E0930 17:32:44.812972 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-w7bd2" podUID="9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0"
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.106217 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ch8th" event={"ID":"567d6f34-762c-4560-8af1-be9c5f0a1945","Type":"ContainerStarted","Data":"f02bc782a91f2351e94c8a09a57c89ad9565f53e86d624e2daac26c7be01d5db"}
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.185659 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-6mk8g"]
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.190421 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Sep 30 17:32:45 crc kubenswrapper[4778]: W0930 17:32:45.191939 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a383937_1f37_4d7c_9ddf_6aefd9fd7329.slice/crio-eabee853c961a5a5ca9636519e1501ff700e5d56d2f6cba999558fced86ff138 WatchSource:0}: Error finding container eabee853c961a5a5ca9636519e1501ff700e5d56d2f6cba999558fced86ff138: Status 404 returned error can't find the container with id eabee853c961a5a5ca9636519e1501ff700e5d56d2f6cba999558fced86ff138
Sep 30 17:32:45 crc kubenswrapper[4778]: W0930 17:32:45.194628 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda46a2506_e4e5_4d7c_b19f_ebf1bb0922a1.slice/crio-dd7900b787a56cc1f2dd585032a9f8e3369b50f211aff115a1cb6bfb51f84232 WatchSource:0}: Error finding container dd7900b787a56cc1f2dd585032a9f8e3369b50f211aff115a1cb6bfb51f84232: Status 404 returned error can't find the container with id dd7900b787a56cc1f2dd585032a9f8e3369b50f211aff115a1cb6bfb51f84232
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.364919 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.365437 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.604655 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-697zn"
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.626766 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-w7bd2"
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.742754 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bt6cr\" (UniqueName: \"kubernetes.io/projected/9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0-kube-api-access-bt6cr\") pod \"9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0\" (UID: \"9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0\") "
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.744164 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06" (UID: "bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.746432 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-dns-svc\") pod \"bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06\" (UID: \"bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06\") "
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.746475 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-config\") pod \"bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06\" (UID: \"bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06\") "
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.746557 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltzt7\" (UniqueName: \"kubernetes.io/projected/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-kube-api-access-ltzt7\") pod \"bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06\" (UID: \"bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06\") "
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.746625 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0-config\") pod \"9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0\" (UID: \"9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0\") "
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.747830 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0-config" (OuterVolumeSpecName: "config") pod "9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0" (UID: "9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.748460 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-config" (OuterVolumeSpecName: "config") pod "bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06" (UID: "bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.751308 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0-kube-api-access-bt6cr" (OuterVolumeSpecName: "kube-api-access-bt6cr") pod "9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0" (UID: "9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0"). InnerVolumeSpecName "kube-api-access-bt6cr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.756135 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-kube-api-access-ltzt7" (OuterVolumeSpecName: "kube-api-access-ltzt7") pod "bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06" (UID: "bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06"). InnerVolumeSpecName "kube-api-access-ltzt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:32:45 crc kubenswrapper[4778]: W0930 17:32:45.808980 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21244498_232a_4725_be68_da731564a70b.slice/crio-8b1f3e91521cbbd327a55c1e0e77ec54454eaf14f156c88fcc02b3bbecfbfe01 WatchSource:0}: Error finding container 8b1f3e91521cbbd327a55c1e0e77ec54454eaf14f156c88fcc02b3bbecfbfe01: Status 404 returned error can't find the container with id 8b1f3e91521cbbd327a55c1e0e77ec54454eaf14f156c88fcc02b3bbecfbfe01
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.849667 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltzt7\" (UniqueName: \"kubernetes.io/projected/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-kube-api-access-ltzt7\") on node \"crc\" DevicePath \"\""
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.849718 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.849756 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bt6cr\" (UniqueName: \"kubernetes.io/projected/9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0-kube-api-access-bt6cr\") on node \"crc\" DevicePath \"\""
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.849770 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.849781 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.896681 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.896752 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-79nnj"]
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.896768 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Sep 30 17:32:45 crc kubenswrapper[4778]: I0930 17:32:45.896782 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Sep 30 17:32:45 crc kubenswrapper[4778]: W0930 17:32:45.913522 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc60ce32d_deec_4c60_9192_6b090ae53773.slice/crio-db09a26848414eb171650e480411f6e9ab0a5cebb7088eecb31fd9cd64ddcef4 WatchSource:0}: Error finding container db09a26848414eb171650e480411f6e9ab0a5cebb7088eecb31fd9cd64ddcef4: Status 404 returned error can't find the container with id db09a26848414eb171650e480411f6e9ab0a5cebb7088eecb31fd9cd64ddcef4
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.017316 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.019744 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.021879 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.022132 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.022409 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.022972 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-j8vnq"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.026521 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.115036 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-w7bd2" event={"ID":"9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0","Type":"ContainerDied","Data":"45055cceb47f9ef2d66da8e28000fe82463171b91c6c96c6b7e39de7ee850754"}
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.115173 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-w7bd2"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.119132 4778 generic.go:334] "Generic (PLEG): container finished" podID="567d6f34-762c-4560-8af1-be9c5f0a1945" containerID="e40e1acaa8ff639ec2045f7fc3eae6ca844ace11d4da9c92dc556047d97fc158" exitCode=0
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.119272 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ch8th" event={"ID":"567d6f34-762c-4560-8af1-be9c5f0a1945","Type":"ContainerDied","Data":"e40e1acaa8ff639ec2045f7fc3eae6ca844ace11d4da9c92dc556047d97fc158"}
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.122368 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-79nnj" event={"ID":"d43aaa67-2f2b-4045-80af-3593da87ed64","Type":"ContainerStarted","Data":"6c7b57d8185b52be2b36e61ba7993023b20092e5d91187c5a1b71f92f7983cc8"}
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.124080 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-697zn"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.124097 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-697zn" event={"ID":"bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06","Type":"ContainerDied","Data":"06388a4a7022f84d2152dcd1e1862f5645bd88302bb8ebf09ca42a2c4389c0a0"}
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.126038 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c60ce32d-deec-4c60-9192-6b090ae53773","Type":"ContainerStarted","Data":"db09a26848414eb171650e480411f6e9ab0a5cebb7088eecb31fd9cd64ddcef4"}
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.127238 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"02bfdd7b-c081-4815-9a55-f39fa4d0384f","Type":"ContainerStarted","Data":"6fb6ede4100614be3122b8a498206cb289d4931c5d530656eaa8aa883f3cf66f"}
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.128716 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1","Type":"ContainerStarted","Data":"dd7900b787a56cc1f2dd585032a9f8e3369b50f211aff115a1cb6bfb51f84232"}
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.130503 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g" event={"ID":"7a383937-1f37-4d7c-9ddf-6aefd9fd7329","Type":"ContainerStarted","Data":"676772e74ca0e39688dac489c1f1619e6e81ab7b4db71475f44ec11116cf53a4"}
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.130526 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g" event={"ID":"7a383937-1f37-4d7c-9ddf-6aefd9fd7329","Type":"ContainerStarted","Data":"eabee853c961a5a5ca9636519e1501ff700e5d56d2f6cba999558fced86ff138"}
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.133330 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"227061d9-b3e7-4711-92f5-283ca4af1412","Type":"ContainerStarted","Data":"d4dfbb8f03b86f019736a1f8afbd631e149a266bf5e1c68d762ff1d10654aed1"}
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.135185 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9705cb32-3888-4b76-863d-7c4dd57185bc","Type":"ContainerStarted","Data":"31239727d83db3436c0d9d698d77eddc7a8ed8068129b4845a60229ae492b929"}
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.141425 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"21244498-232a-4725-be68-da731564a70b","Type":"ContainerStarted","Data":"8b1f3e91521cbbd327a55c1e0e77ec54454eaf14f156c88fcc02b3bbecfbfe01"}
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.154426 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ce405643-c53c-472d-802c-a3d8fe7840a0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.154502 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce405643-c53c-472d-802c-a3d8fe7840a0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.154528 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.154659 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce405643-c53c-472d-802c-a3d8fe7840a0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.154686 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6dns\" (UniqueName: \"kubernetes.io/projected/ce405643-c53c-472d-802c-a3d8fe7840a0-kube-api-access-v6dns\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.154761 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce405643-c53c-472d-802c-a3d8fe7840a0-config\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.154859 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce405643-c53c-472d-802c-a3d8fe7840a0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.154884 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce405643-c53c-472d-802c-a3d8fe7840a0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.217406 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-697zn"]
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.220349 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-697zn"]
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.242760 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-w7bd2"]
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.246712 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-w7bd2"]
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.256468 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce405643-c53c-472d-802c-a3d8fe7840a0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.256517 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6dns\" (UniqueName: \"kubernetes.io/projected/ce405643-c53c-472d-802c-a3d8fe7840a0-kube-api-access-v6dns\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.256538 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce405643-c53c-472d-802c-a3d8fe7840a0-config\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.256571 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce405643-c53c-472d-802c-a3d8fe7840a0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.256589 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce405643-c53c-472d-802c-a3d8fe7840a0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.256687 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ce405643-c53c-472d-802c-a3d8fe7840a0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.256703 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce405643-c53c-472d-802c-a3d8fe7840a0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.256722 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.257048 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.258436 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ce405643-c53c-472d-802c-a3d8fe7840a0-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.259045 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce405643-c53c-472d-802c-a3d8fe7840a0-config\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0"
Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.261506 4778 operation_generator.go:637]
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ce405643-c53c-472d-802c-a3d8fe7840a0-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.264191 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce405643-c53c-472d-802c-a3d8fe7840a0-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.266480 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce405643-c53c-472d-802c-a3d8fe7840a0-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.266650 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ce405643-c53c-472d-802c-a3d8fe7840a0-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.276965 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6dns\" (UniqueName: \"kubernetes.io/projected/ce405643-c53c-472d-802c-a3d8fe7840a0-kube-api-access-v6dns\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.316804 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ce405643-c53c-472d-802c-a3d8fe7840a0\") " pod="openstack/ovsdbserver-sb-0" Sep 30 17:32:46 crc kubenswrapper[4778]: E0930 17:32:46.351727 4778 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Sep 30 17:32:46 crc kubenswrapper[4778]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/567d6f34-762c-4560-8af1-be9c5f0a1945/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 30 17:32:46 crc kubenswrapper[4778]: > podSandboxID="f02bc782a91f2351e94c8a09a57c89ad9565f53e86d624e2daac26c7be01d5db" Sep 30 17:32:46 crc kubenswrapper[4778]: E0930 17:32:46.351905 4778 kuberuntime_manager.go:1274] "Unhandled Error" err=< Sep 30 17:32:46 crc kubenswrapper[4778]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7xssr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-ch8th_openstack(567d6f34-762c-4560-8af1-be9c5f0a1945): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/567d6f34-762c-4560-8af1-be9c5f0a1945/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Sep 30 17:32:46 crc kubenswrapper[4778]: > logger="UnhandledError" Sep 30 17:32:46 crc kubenswrapper[4778]: E0930 17:32:46.353187 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/567d6f34-762c-4560-8af1-be9c5f0a1945/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-ch8th" podUID="567d6f34-762c-4560-8af1-be9c5f0a1945" Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.375996 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.567551 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-qd6f4"] Sep 30 17:32:46 crc kubenswrapper[4778]: W0930 17:32:46.593787 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a8e6ffe_b59c_43c2_b8f2_dec2fa786df6.slice/crio-d6bf6f143149f2dd5921ba05788cb10e43e2d4ffdce1f42a4227774114278b94 WatchSource:0}: Error finding container d6bf6f143149f2dd5921ba05788cb10e43e2d4ffdce1f42a4227774114278b94: Status 404 returned error can't find the container with id d6bf6f143149f2dd5921ba05788cb10e43e2d4ffdce1f42a4227774114278b94 Sep 30 17:32:46 crc kubenswrapper[4778]: I0930 17:32:46.940210 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 17:32:47 crc kubenswrapper[4778]: I0930 17:32:47.154245 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ce405643-c53c-472d-802c-a3d8fe7840a0","Type":"ContainerStarted","Data":"b152d59fdbbb1a7cd1b66d9b6b24216382daaefcf31558d27b9472959c8ab65b"} Sep 30 17:32:47 crc kubenswrapper[4778]: I0930 17:32:47.156252 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qd6f4" event={"ID":"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6","Type":"ContainerStarted","Data":"d6bf6f143149f2dd5921ba05788cb10e43e2d4ffdce1f42a4227774114278b94"} Sep 30 17:32:47 crc kubenswrapper[4778]: I0930 17:32:47.160041 4778 generic.go:334] "Generic (PLEG): container finished" podID="7a383937-1f37-4d7c-9ddf-6aefd9fd7329" containerID="676772e74ca0e39688dac489c1f1619e6e81ab7b4db71475f44ec11116cf53a4" exitCode=0 Sep 30 17:32:47 crc kubenswrapper[4778]: I0930 17:32:47.161334 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g" event={"ID":"7a383937-1f37-4d7c-9ddf-6aefd9fd7329","Type":"ContainerDied","Data":"676772e74ca0e39688dac489c1f1619e6e81ab7b4db71475f44ec11116cf53a4"} Sep 30 17:32:47 crc kubenswrapper[4778]: I0930 17:32:47.161384 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g" event={"ID":"7a383937-1f37-4d7c-9ddf-6aefd9fd7329","Type":"ContainerStarted","Data":"dbd26f2b6e1b1cde673fea2b81fd84d9bfa3c882b68a829b776cf3fdbc646a77"} Sep 30 17:32:47 crc kubenswrapper[4778]: I0930 17:32:47.161748 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g" Sep 30 17:32:47 crc kubenswrapper[4778]: I0930 17:32:47.182422 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g" podStartSLOduration=13.772438743 podStartE2EDuration="14.182402496s" podCreationTimestamp="2025-09-30 17:32:33 +0000 UTC" firstStartedPulling="2025-09-30 17:32:45.208304758 +0000 UTC m=+904.198202561" lastFinishedPulling="2025-09-30 17:32:45.618268511 +0000 UTC m=+904.608166314" observedRunningTime="2025-09-30 17:32:47.177491096 +0000 UTC m=+906.167388919" watchObservedRunningTime="2025-09-30 17:32:47.182402496 +0000 UTC m=+906.172300299" Sep 30 17:32:47 crc kubenswrapper[4778]: I0930 17:32:47.731327 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0" path="/var/lib/kubelet/pods/9f9bc6f1-8f73-4e50-b7f2-901b11c9cfb0/volumes" Sep 30 17:32:47 crc kubenswrapper[4778]: I0930 17:32:47.731695 4778 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06" path="/var/lib/kubelet/pods/bd3b0f70-4d1c-41f2-b7a2-a19e2aeb7f06/volumes" Sep 30 17:32:49 crc kubenswrapper[4778]: I0930 17:32:49.181936 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ch8th" event={"ID":"567d6f34-762c-4560-8af1-be9c5f0a1945","Type":"ContainerStarted","Data":"08cc6e04e2dc495d1e95b89a9466f1fd15d75ada5cc538f7b9074c09b628d78d"} Sep 30 17:32:49 crc kubenswrapper[4778]: I0930 17:32:49.182975 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-ch8th" Sep 30 17:32:49 crc kubenswrapper[4778]: I0930 17:32:49.200835 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-ch8th" podStartSLOduration=15.682041906 podStartE2EDuration="16.200815582s" podCreationTimestamp="2025-09-30 17:32:33 +0000 UTC" firstStartedPulling="2025-09-30 17:32:44.747208265 +0000 UTC m=+903.737106088" lastFinishedPulling="2025-09-30 17:32:45.265981961 +0000 UTC m=+904.255879764" observedRunningTime="2025-09-30 17:32:49.199758169 +0000 UTC m=+908.189655982" watchObservedRunningTime="2025-09-30 17:32:49.200815582 +0000 UTC m=+908.190713385" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.298834 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-9cgft"] Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.300876 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.304661 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.310158 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9cgft"] Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.443029 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-6mk8g"] Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.444366 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.444404 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-ovn-rundir\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.444427 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-ovs-rundir\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.444447 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-config\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.444481 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-combined-ca-bundle\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.444502 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2vb2\" (UniqueName: \"kubernetes.io/projected/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-kube-api-access-t2vb2\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.445579 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g" podUID="7a383937-1f37-4d7c-9ddf-6aefd9fd7329" containerName="dnsmasq-dns" containerID="cri-o://dbd26f2b6e1b1cde673fea2b81fd84d9bfa3c882b68a829b776cf3fdbc646a77" gracePeriod=10 Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.461888 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-xfrqm"] Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.463012 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.465222 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.480761 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-xfrqm"] Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.546450 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-xfrqm\" (UID: \"7e0a85c1-de10-487c-8614-250af0740399\") " pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.546496 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-config\") pod \"dnsmasq-dns-7fd796d7df-xfrqm\" (UID: \"7e0a85c1-de10-487c-8614-250af0740399\") " pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.546516 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx2x2\" (UniqueName: \"kubernetes.io/projected/7e0a85c1-de10-487c-8614-250af0740399-kube-api-access-xx2x2\") pod \"dnsmasq-dns-7fd796d7df-xfrqm\" (UID: \"7e0a85c1-de10-487c-8614-250af0740399\") " pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.546626 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-dns-svc\") 
pod \"dnsmasq-dns-7fd796d7df-xfrqm\" (UID: \"7e0a85c1-de10-487c-8614-250af0740399\") " pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.546669 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.546690 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-ovn-rundir\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.546731 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-ovs-rundir\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.546761 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-config\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.546796 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-combined-ca-bundle\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.546815 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2vb2\" (UniqueName: \"kubernetes.io/projected/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-kube-api-access-t2vb2\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.546970 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-ovn-rundir\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.547022 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-ovs-rundir\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.547834 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-config\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " 
pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.555795 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.568016 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2vb2\" (UniqueName: \"kubernetes.io/projected/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-kube-api-access-t2vb2\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.577562 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc1dcfb-4095-49c5-bafb-adc94165d6c3-combined-ca-bundle\") pod \"ovn-controller-metrics-9cgft\" (UID: \"ecc1dcfb-4095-49c5-bafb-adc94165d6c3\") " pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.594155 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ch8th"] Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.619098 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qx7rk"] Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.622338 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.624677 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.631593 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-9cgft" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.640976 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qx7rk"] Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.651215 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-xfrqm\" (UID: \"7e0a85c1-de10-487c-8614-250af0740399\") " pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.651357 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-xfrqm\" (UID: \"7e0a85c1-de10-487c-8614-250af0740399\") " pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.651391 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-config\") pod \"dnsmasq-dns-7fd796d7df-xfrqm\" (UID: \"7e0a85c1-de10-487c-8614-250af0740399\") " pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.651416 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx2x2\" (UniqueName: \"kubernetes.io/projected/7e0a85c1-de10-487c-8614-250af0740399-kube-api-access-xx2x2\") pod \"dnsmasq-dns-7fd796d7df-xfrqm\" (UID: \"7e0a85c1-de10-487c-8614-250af0740399\") " pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.654521 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-xfrqm\" (UID: \"7e0a85c1-de10-487c-8614-250af0740399\") " pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.654835 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-xfrqm\" (UID: \"7e0a85c1-de10-487c-8614-250af0740399\") " pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.655178 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-config\") pod \"dnsmasq-dns-7fd796d7df-xfrqm\" (UID: \"7e0a85c1-de10-487c-8614-250af0740399\") " pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.690830 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx2x2\" (UniqueName: \"kubernetes.io/projected/7e0a85c1-de10-487c-8614-250af0740399-kube-api-access-xx2x2\") pod \"dnsmasq-dns-7fd796d7df-xfrqm\" (UID: \"7e0a85c1-de10-487c-8614-250af0740399\") " pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.753567 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-qx7rk\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") " pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.753789 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-config\") pod \"dnsmasq-dns-86db49b7ff-qx7rk\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") " pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.753852 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7gxf\" (UniqueName: \"kubernetes.io/projected/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-kube-api-access-m7gxf\") pod \"dnsmasq-dns-86db49b7ff-qx7rk\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") " pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.753971 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-qx7rk\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") " pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.754026 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-qx7rk\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") " pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.855653 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-qx7rk\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") " pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.855768 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-qx7rk\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") " pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.855799 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-config\") pod \"dnsmasq-dns-86db49b7ff-qx7rk\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") " pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.855822 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7gxf\" (UniqueName: \"kubernetes.io/projected/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-kube-api-access-m7gxf\") pod \"dnsmasq-dns-86db49b7ff-qx7rk\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") " pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.855856 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-qx7rk\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") " pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.856700 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-qx7rk\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") " pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.856732 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-qx7rk\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") " pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.856907 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-config\") pod \"dnsmasq-dns-86db49b7ff-qx7rk\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") " pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.857360 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-qx7rk\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") " pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.886489 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7gxf\" (UniqueName: \"kubernetes.io/projected/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-kube-api-access-m7gxf\") pod \"dnsmasq-dns-86db49b7ff-qx7rk\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") " pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.919407 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:32:50 crc kubenswrapper[4778]: I0930 17:32:50.959556 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:32:51 crc kubenswrapper[4778]: I0930 17:32:51.198771 4778 generic.go:334] "Generic (PLEG): container finished" podID="7a383937-1f37-4d7c-9ddf-6aefd9fd7329" containerID="dbd26f2b6e1b1cde673fea2b81fd84d9bfa3c882b68a829b776cf3fdbc646a77" exitCode=0 Sep 30 17:32:51 crc kubenswrapper[4778]: I0930 17:32:51.198872 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g" event={"ID":"7a383937-1f37-4d7c-9ddf-6aefd9fd7329","Type":"ContainerDied","Data":"dbd26f2b6e1b1cde673fea2b81fd84d9bfa3c882b68a829b776cf3fdbc646a77"} Sep 30 17:32:51 crc kubenswrapper[4778]: I0930 17:32:51.199038 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-ch8th" podUID="567d6f34-762c-4560-8af1-be9c5f0a1945" containerName="dnsmasq-dns" containerID="cri-o://08cc6e04e2dc495d1e95b89a9466f1fd15d75ada5cc538f7b9074c09b628d78d" gracePeriod=10 Sep 30 17:32:52 crc kubenswrapper[4778]: I0930 17:32:52.209219 4778 generic.go:334] "Generic (PLEG): container finished" podID="567d6f34-762c-4560-8af1-be9c5f0a1945" containerID="08cc6e04e2dc495d1e95b89a9466f1fd15d75ada5cc538f7b9074c09b628d78d" exitCode=0 Sep 30 17:32:52 crc kubenswrapper[4778]: I0930 17:32:52.209298 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ch8th" event={"ID":"567d6f34-762c-4560-8af1-be9c5f0a1945","Type":"ContainerDied","Data":"08cc6e04e2dc495d1e95b89a9466f1fd15d75ada5cc538f7b9074c09b628d78d"} Sep 30 17:32:53 crc kubenswrapper[4778]: I0930 17:32:53.536829 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-666b6646f7-ch8th" podUID="567d6f34-762c-4560-8af1-be9c5f0a1945" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.99:5353: connect: connection refused" Sep 30 17:32:53 crc kubenswrapper[4778]: I0930 17:32:53.854098 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g" podUID="7a383937-1f37-4d7c-9ddf-6aefd9fd7329" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.100:5353: connect: connection refused" Sep 30 17:32:57 crc kubenswrapper[4778]: E0930 17:32:57.349229 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Sep 30 17:32:57 crc kubenswrapper[4778]: E0930 17:32:57.349931 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n5b5h5cdh588h699hfbh5f8hch556hb8hfh588h5ffh77hc5h7dh574h99h9bh585h57fh65chbh5b4h66dhbh57ch66dh54ch697h5ffh5d8h596q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cm2l5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(21244498-232a-4725-be68-da731564a70b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:32:57 crc kubenswrapper[4778]: E0930 17:32:57.351402 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="21244498-232a-4725-be68-da731564a70b" Sep 30 17:32:57 crc kubenswrapper[4778]: E0930 17:32:57.387185 4778 log.go:32] "PullImage 
from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified" Sep 30 17:32:57 crc kubenswrapper[4778]: E0930 17:32:57.387381 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:ovsdb-server-init,Image:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,Command:[/usr/local/bin/container-scripts/init-ovsdb-server.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n69h6ch5b9h668hbch5fbh64dh64ch68h9dh99h7ch567h5b5h644h7chcdh66fh679hb4h64bh89h5c9h695hdch675h5b6hb4h6h5ddh78h5cfq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-ovs,ReadOnly:false,MountPath:/etc/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-run,ReadOnly:false,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-log,ReadOnly:false,MountPath:/var/log/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:var-lib,ReadOnly:false,MountPath:/var/lib/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g5n7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-ovs-qd6f4_openstack(9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:32:57 crc kubenswrapper[4778]: E0930 17:32:57.388665 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-ovs-qd6f4" podUID="9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6" Sep 30 17:32:57 crc kubenswrapper[4778]: E0930 17:32:57.863963 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified" Sep 30 17:32:57 crc kubenswrapper[4778]: E0930 17:32:57.864146 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-nb,Image:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n579h8dh57fh685h5h74hdfh589h54chfhfdh5d4h584h75hbh669h5dch5c4h57h5bh5fbh9hcbh57bh557h655h86h695h545hf6h5dbhcfq,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cxcbt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ovsdbserver-nb-0_openstack(c60ce32d-deec-4c60-9192-6b090ae53773): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:32:57 crc kubenswrapper[4778]: I0930 17:32:57.943551 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ch8th" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.106750 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/567d6f34-762c-4560-8af1-be9c5f0a1945-dns-svc\") pod \"567d6f34-762c-4560-8af1-be9c5f0a1945\" (UID: \"567d6f34-762c-4560-8af1-be9c5f0a1945\") " Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.107165 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xssr\" (UniqueName: \"kubernetes.io/projected/567d6f34-762c-4560-8af1-be9c5f0a1945-kube-api-access-7xssr\") pod \"567d6f34-762c-4560-8af1-be9c5f0a1945\" (UID: \"567d6f34-762c-4560-8af1-be9c5f0a1945\") " Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.107218 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/567d6f34-762c-4560-8af1-be9c5f0a1945-config\") pod \"567d6f34-762c-4560-8af1-be9c5f0a1945\" (UID: \"567d6f34-762c-4560-8af1-be9c5f0a1945\") " Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.112122 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/567d6f34-762c-4560-8af1-be9c5f0a1945-kube-api-access-7xssr" (OuterVolumeSpecName: "kube-api-access-7xssr") pod "567d6f34-762c-4560-8af1-be9c5f0a1945" (UID: "567d6f34-762c-4560-8af1-be9c5f0a1945"). InnerVolumeSpecName "kube-api-access-7xssr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.142566 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567d6f34-762c-4560-8af1-be9c5f0a1945-config" (OuterVolumeSpecName: "config") pod "567d6f34-762c-4560-8af1-be9c5f0a1945" (UID: "567d6f34-762c-4560-8af1-be9c5f0a1945"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.142969 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/567d6f34-762c-4560-8af1-be9c5f0a1945-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "567d6f34-762c-4560-8af1-be9c5f0a1945" (UID: "567d6f34-762c-4560-8af1-be9c5f0a1945"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.186176 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.231166 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/567d6f34-762c-4560-8af1-be9c5f0a1945-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.231216 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xssr\" (UniqueName: \"kubernetes.io/projected/567d6f34-762c-4560-8af1-be9c5f0a1945-kube-api-access-7xssr\") on node \"crc\" DevicePath \"\"" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.231231 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/567d6f34-762c-4560-8af1-be9c5f0a1945-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.270837 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-ch8th" event={"ID":"567d6f34-762c-4560-8af1-be9c5f0a1945","Type":"ContainerDied","Data":"f02bc782a91f2351e94c8a09a57c89ad9565f53e86d624e2daac26c7be01d5db"} Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.270946 4778 scope.go:117] "RemoveContainer" containerID="08cc6e04e2dc495d1e95b89a9466f1fd15d75ada5cc538f7b9074c09b628d78d" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.271157 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-ch8th" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.287815 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g" event={"ID":"7a383937-1f37-4d7c-9ddf-6aefd9fd7329","Type":"ContainerDied","Data":"eabee853c961a5a5ca9636519e1501ff700e5d56d2f6cba999558fced86ff138"} Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.288064 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-6mk8g" Sep 30 17:32:58 crc kubenswrapper[4778]: E0930 17:32:58.294034 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="21244498-232a-4725-be68-da731564a70b" Sep 30 17:32:58 crc kubenswrapper[4778]: E0930 17:32:58.294127 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdb-server-init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified\\\"\"" pod="openstack/ovn-controller-ovs-qd6f4" podUID="9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.332107 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpx8n\" (UniqueName: \"kubernetes.io/projected/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-kube-api-access-zpx8n\") pod \"7a383937-1f37-4d7c-9ddf-6aefd9fd7329\" (UID: \"7a383937-1f37-4d7c-9ddf-6aefd9fd7329\") " Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.332157 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-dns-svc\") pod \"7a383937-1f37-4d7c-9ddf-6aefd9fd7329\" (UID: \"7a383937-1f37-4d7c-9ddf-6aefd9fd7329\") " Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.332335 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-config\") pod \"7a383937-1f37-4d7c-9ddf-6aefd9fd7329\" (UID: \"7a383937-1f37-4d7c-9ddf-6aefd9fd7329\") " Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.344901 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-kube-api-access-zpx8n" (OuterVolumeSpecName: "kube-api-access-zpx8n") pod "7a383937-1f37-4d7c-9ddf-6aefd9fd7329" (UID: "7a383937-1f37-4d7c-9ddf-6aefd9fd7329"). InnerVolumeSpecName "kube-api-access-zpx8n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.362888 4778 scope.go:117] "RemoveContainer" containerID="e40e1acaa8ff639ec2045f7fc3eae6ca844ace11d4da9c92dc556047d97fc158" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.389593 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ch8th"] Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.396338 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-ch8th"] Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.423560 4778 scope.go:117] "RemoveContainer" containerID="dbd26f2b6e1b1cde673fea2b81fd84d9bfa3c882b68a829b776cf3fdbc646a77" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.435442 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpx8n\" (UniqueName: \"kubernetes.io/projected/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-kube-api-access-zpx8n\") on node \"crc\" DevicePath \"\"" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.460818 4778 scope.go:117] "RemoveContainer" containerID="676772e74ca0e39688dac489c1f1619e6e81ab7b4db71475f44ec11116cf53a4" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.577901 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a383937-1f37-4d7c-9ddf-6aefd9fd7329" (UID: "7a383937-1f37-4d7c-9ddf-6aefd9fd7329"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.594323 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-config" (OuterVolumeSpecName: "config") pod "7a383937-1f37-4d7c-9ddf-6aefd9fd7329" (UID: "7a383937-1f37-4d7c-9ddf-6aefd9fd7329"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.643530 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-9cgft"] Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.646553 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.646581 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a383937-1f37-4d7c-9ddf-6aefd9fd7329-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.725052 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-xfrqm"] Sep 30 17:32:58 crc kubenswrapper[4778]: W0930 17:32:58.731642 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e0a85c1_de10_487c_8614_250af0740399.slice/crio-da5bcd211bb3101553fd748e5bce7f4622ca8570a62a63c060ea9e216672a015 WatchSource:0}: Error finding container da5bcd211bb3101553fd748e5bce7f4622ca8570a62a63c060ea9e216672a015: Status 404 returned error can't find the container with id da5bcd211bb3101553fd748e5bce7f4622ca8570a62a63c060ea9e216672a015 Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.774126 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qx7rk"] Sep 30 17:32:58 crc kubenswrapper[4778]: W0930 17:32:58.786139 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb6d0d28_eda0_4390_bea4_7ba28fff79a7.slice/crio-58f0f5ffe739bb917107f52f938b8b65bfa54ddc93319a1bab3af6fbe2db83e9 WatchSource:0}: Error finding container 58f0f5ffe739bb917107f52f938b8b65bfa54ddc93319a1bab3af6fbe2db83e9: Status 404 returned error can't find the container with id 58f0f5ffe739bb917107f52f938b8b65bfa54ddc93319a1bab3af6fbe2db83e9 Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.923062 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-6mk8g"] Sep 30 17:32:58 crc kubenswrapper[4778]: I0930 17:32:58.937109 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-6mk8g"] Sep 30 17:32:59 crc kubenswrapper[4778]: I0930 17:32:59.299742 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"227061d9-b3e7-4711-92f5-283ca4af1412","Type":"ContainerStarted","Data":"39ae642e9e5a90afc68250d2f463142ba3551606556210a8b25593d98603e228"} Sep 30 17:32:59 crc kubenswrapper[4778]: I0930 17:32:59.302861 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ce405643-c53c-472d-802c-a3d8fe7840a0","Type":"ContainerStarted","Data":"a383c53a13d7d1293d7d85a542fa2a93a597507945fb87889cf5c0ec755eb75f"} Sep 30 17:32:59 crc kubenswrapper[4778]: I0930 17:32:59.304501 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-79nnj" event={"ID":"d43aaa67-2f2b-4045-80af-3593da87ed64","Type":"ContainerStarted","Data":"ee23c330fd18f27c7ff9407d603287e9fe0ff842ae47298d6ceac3352f62a3e4"} Sep 30 17:32:59 crc kubenswrapper[4778]: I0930 17:32:59.304754 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-79nnj" Sep 30 
17:32:59 crc kubenswrapper[4778]: I0930 17:32:59.307741 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9cgft" event={"ID":"ecc1dcfb-4095-49c5-bafb-adc94165d6c3","Type":"ContainerStarted","Data":"cc22a4f3faf63aa43636ddd486b0adcd0d3395895ef58d61505e818bce033073"} Sep 30 17:32:59 crc kubenswrapper[4778]: I0930 17:32:59.309725 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9705cb32-3888-4b76-863d-7c4dd57185bc","Type":"ContainerStarted","Data":"0fa32bcdc4c9a5bb671c16a5853d705e1eeafeb2e77e648904cc1a82ff468ae7"} Sep 30 17:32:59 crc kubenswrapper[4778]: I0930 17:32:59.314378 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"02bfdd7b-c081-4815-9a55-f39fa4d0384f","Type":"ContainerStarted","Data":"9b69da6b162cd8fcea4bbcabed347f8d0c55258c4dfa607237d0a5ae5b996d9a"} Sep 30 17:32:59 crc kubenswrapper[4778]: I0930 17:32:59.316657 4778 generic.go:334] "Generic (PLEG): container finished" podID="bb6d0d28-eda0-4390-bea4-7ba28fff79a7" containerID="26dafc618861cb8faee3114255dfa7ef13691ea3eada354c149657512fa944fc" exitCode=0 Sep 30 17:32:59 crc kubenswrapper[4778]: I0930 17:32:59.316763 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" event={"ID":"bb6d0d28-eda0-4390-bea4-7ba28fff79a7","Type":"ContainerDied","Data":"26dafc618861cb8faee3114255dfa7ef13691ea3eada354c149657512fa944fc"} Sep 30 17:32:59 crc kubenswrapper[4778]: I0930 17:32:59.316795 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" event={"ID":"bb6d0d28-eda0-4390-bea4-7ba28fff79a7","Type":"ContainerStarted","Data":"58f0f5ffe739bb917107f52f938b8b65bfa54ddc93319a1bab3af6fbe2db83e9"} Sep 30 17:32:59 crc kubenswrapper[4778]: I0930 17:32:59.319501 4778 generic.go:334] "Generic (PLEG): container finished" podID="7e0a85c1-de10-487c-8614-250af0740399" containerID="9681aa7e6415c27aa18fad52c4df9045ada8b1113219bda929fb170f0b035e91" exitCode=0 Sep 30 17:32:59 crc kubenswrapper[4778]: I0930 17:32:59.319540 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" event={"ID":"7e0a85c1-de10-487c-8614-250af0740399","Type":"ContainerDied","Data":"9681aa7e6415c27aa18fad52c4df9045ada8b1113219bda929fb170f0b035e91"} Sep 30 17:32:59 crc kubenswrapper[4778]: I0930 17:32:59.319581 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" event={"ID":"7e0a85c1-de10-487c-8614-250af0740399","Type":"ContainerStarted","Data":"da5bcd211bb3101553fd748e5bce7f4622ca8570a62a63c060ea9e216672a015"} Sep 30 17:32:59 crc kubenswrapper[4778]: I0930 17:32:59.320814 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1","Type":"ContainerStarted","Data":"1e6e1df402ba14a96802eacaf1517a755fed4b9c9b882dc7b976f77bc3c679c8"} Sep 30 17:32:59 crc kubenswrapper[4778]: I0930 17:32:59.424983 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-79nnj" podStartSLOduration=4.015672986 podStartE2EDuration="16.424958558s" podCreationTimestamp="2025-09-30 17:32:43 +0000 UTC" firstStartedPulling="2025-09-30 17:32:45.774386642 +0000 UTC m=+904.764284445" lastFinishedPulling="2025-09-30 17:32:58.183672214 +0000 UTC m=+917.173570017" observedRunningTime="2025-09-30 17:32:59.4161613 +0000 UTC m=+918.406059113" 
watchObservedRunningTime="2025-09-30 17:32:59.424958558 +0000 UTC m=+918.414856361" Sep 30 17:32:59 crc kubenswrapper[4778]: I0930 17:32:59.725607 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="567d6f34-762c-4560-8af1-be9c5f0a1945" path="/var/lib/kubelet/pods/567d6f34-762c-4560-8af1-be9c5f0a1945/volumes" Sep 30 17:32:59 crc kubenswrapper[4778]: I0930 17:32:59.726654 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a383937-1f37-4d7c-9ddf-6aefd9fd7329" path="/var/lib/kubelet/pods/7a383937-1f37-4d7c-9ddf-6aefd9fd7329/volumes" Sep 30 17:33:00 crc kubenswrapper[4778]: I0930 17:33:00.336072 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" event={"ID":"bb6d0d28-eda0-4390-bea4-7ba28fff79a7","Type":"ContainerStarted","Data":"2c4a9a0a37751c1633e3d86e563d0d9cc2e97d7eabaf965b40c82b1cbc9de113"} Sep 30 17:33:00 crc kubenswrapper[4778]: I0930 17:33:00.336268 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:33:00 crc kubenswrapper[4778]: I0930 17:33:00.339750 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" event={"ID":"7e0a85c1-de10-487c-8614-250af0740399","Type":"ContainerStarted","Data":"a163da67d2e87ff4381d38d39354a105e432a59f3dcbb42faed3d9d528208bfe"} Sep 30 17:33:00 crc kubenswrapper[4778]: I0930 17:33:00.359861 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" podStartSLOduration=10.35982742 podStartE2EDuration="10.35982742s" podCreationTimestamp="2025-09-30 17:32:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:33:00.357743347 +0000 UTC m=+919.347641160" watchObservedRunningTime="2025-09-30 17:33:00.35982742 +0000 UTC m=+919.349725253" Sep 30 17:33:00 crc kubenswrapper[4778]: I0930 17:33:00.919714 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:33:01 crc kubenswrapper[4778]: E0930 17:33:01.360342 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovsdbserver-nb-0" podUID="c60ce32d-deec-4c60-9192-6b090ae53773" Sep 30 17:33:01 crc kubenswrapper[4778]: I0930 17:33:01.749853 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" podStartSLOduration=11.749825425000001 podStartE2EDuration="11.749825425s" podCreationTimestamp="2025-09-30 17:32:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:33:00.381169961 +0000 UTC m=+919.371067844" watchObservedRunningTime="2025-09-30 17:33:01.749825425 +0000 UTC m=+920.739723268" Sep 30 17:33:02 crc kubenswrapper[4778]: I0930 17:33:02.366271 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ce405643-c53c-472d-802c-a3d8fe7840a0","Type":"ContainerStarted","Data":"303e023158076f97a4008b5dedfaee6708ca7c14d0574650ee053f180ecf787f"} Sep 30 17:33:02 crc kubenswrapper[4778]: I0930 17:33:02.368338 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-9cgft" 
event={"ID":"ecc1dcfb-4095-49c5-bafb-adc94165d6c3","Type":"ContainerStarted","Data":"4dfebacf44fb0922b82bcee7653929a35a75ef83118dc20eda04933f0816a9b2"} Sep 30 17:33:02 crc kubenswrapper[4778]: I0930 17:33:02.372523 4778 generic.go:334] "Generic (PLEG): container finished" podID="9705cb32-3888-4b76-863d-7c4dd57185bc" containerID="0fa32bcdc4c9a5bb671c16a5853d705e1eeafeb2e77e648904cc1a82ff468ae7" exitCode=0 Sep 30 17:33:02 crc kubenswrapper[4778]: I0930 17:33:02.372746 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9705cb32-3888-4b76-863d-7c4dd57185bc","Type":"ContainerDied","Data":"0fa32bcdc4c9a5bb671c16a5853d705e1eeafeb2e77e648904cc1a82ff468ae7"} Sep 30 17:33:02 crc kubenswrapper[4778]: I0930 17:33:02.376524 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c60ce32d-deec-4c60-9192-6b090ae53773","Type":"ContainerStarted","Data":"e7a831a1290a069098b9ef78deccf6ef2c8328759246d83fecdc1c81f8ecd257"} Sep 30 17:33:02 crc kubenswrapper[4778]: E0930 17:33:02.378582 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="c60ce32d-deec-4c60-9192-6b090ae53773" Sep 30 17:33:02 crc kubenswrapper[4778]: I0930 17:33:02.380915 4778 generic.go:334] "Generic (PLEG): container finished" podID="02bfdd7b-c081-4815-9a55-f39fa4d0384f" containerID="9b69da6b162cd8fcea4bbcabed347f8d0c55258c4dfa607237d0a5ae5b996d9a" exitCode=0 Sep 30 17:33:02 crc kubenswrapper[4778]: I0930 17:33:02.381004 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"02bfdd7b-c081-4815-9a55-f39fa4d0384f","Type":"ContainerDied","Data":"9b69da6b162cd8fcea4bbcabed347f8d0c55258c4dfa607237d0a5ae5b996d9a"} Sep 30 17:33:02 crc kubenswrapper[4778]: I0930 17:33:02.407729 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=4.197446301 podStartE2EDuration="18.407706291s" podCreationTimestamp="2025-09-30 17:32:44 +0000 UTC" firstStartedPulling="2025-09-30 17:32:46.949965078 +0000 UTC m=+905.939862881" lastFinishedPulling="2025-09-30 17:33:01.160225068 +0000 UTC m=+920.150122871" observedRunningTime="2025-09-30 17:33:02.403379769 +0000 UTC m=+921.393277592" watchObservedRunningTime="2025-09-30 17:33:02.407706291 +0000 UTC m=+921.397604104" Sep 30 17:33:02 crc kubenswrapper[4778]: I0930 17:33:02.482342 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-9cgft" podStartSLOduration=10.000376573 podStartE2EDuration="12.482319513s" podCreationTimestamp="2025-09-30 17:32:50 +0000 UTC" firstStartedPulling="2025-09-30 17:32:58.664970692 +0000 UTC m=+917.654868495" lastFinishedPulling="2025-09-30 17:33:01.146913632 +0000 UTC m=+920.136811435" observedRunningTime="2025-09-30 17:33:02.478081014 +0000 UTC m=+921.467978847" watchObservedRunningTime="2025-09-30 17:33:02.482319513 +0000 UTC m=+921.472217316" Sep 30 17:33:03 crc kubenswrapper[4778]: I0930 17:33:03.396995 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"9705cb32-3888-4b76-863d-7c4dd57185bc","Type":"ContainerStarted","Data":"14dbe97587ecbbdc9d947a5b1c6cd47ab61912e341950c86bdfabcac11e180aa"} Sep 30 17:33:03 crc kubenswrapper[4778]: 
I0930 17:33:03.404191 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"02bfdd7b-c081-4815-9a55-f39fa4d0384f","Type":"ContainerStarted","Data":"25e79376669db4076f3da1a53f4af71f7cafca6e73335f34d644bb7ee2757445"} Sep 30 17:33:03 crc kubenswrapper[4778]: E0930 17:33:03.408258 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified\\\"\"" pod="openstack/ovsdbserver-nb-0" podUID="c60ce32d-deec-4c60-9192-6b090ae53773" Sep 30 17:33:03 crc kubenswrapper[4778]: I0930 17:33:03.431167 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=15.034788467 podStartE2EDuration="27.431128111s" podCreationTimestamp="2025-09-30 17:32:36 +0000 UTC" firstStartedPulling="2025-09-30 17:32:45.756251773 +0000 UTC m=+904.746149576" lastFinishedPulling="2025-09-30 17:32:58.152591407 +0000 UTC m=+917.142489220" observedRunningTime="2025-09-30 17:33:03.423482478 +0000 UTC m=+922.413380331" watchObservedRunningTime="2025-09-30 17:33:03.431128111 +0000 UTC m=+922.421025944" Sep 30 17:33:03 crc kubenswrapper[4778]: I0930 17:33:03.465379 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=14.635735922 podStartE2EDuration="27.465355113s" podCreationTimestamp="2025-09-30 17:32:36 +0000 UTC" firstStartedPulling="2025-09-30 17:32:45.369494553 +0000 UTC m=+904.359392356" lastFinishedPulling="2025-09-30 17:32:58.199113724 +0000 UTC m=+917.189011547" observedRunningTime="2025-09-30 17:33:03.462427134 +0000 UTC m=+922.452324947" watchObservedRunningTime="2025-09-30 17:33:03.465355113 +0000 UTC m=+922.455252936" Sep 30 17:33:04 crc kubenswrapper[4778]: I0930 17:33:04.377065 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Sep 30 17:33:04 crc kubenswrapper[4778]: I0930 17:33:04.444175 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Sep 30 17:33:05 crc kubenswrapper[4778]: I0930 17:33:05.419867 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Sep 30 17:33:05 crc kubenswrapper[4778]: I0930 17:33:05.478808 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Sep 30 17:33:05 crc kubenswrapper[4778]: I0930 17:33:05.921962 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:33:05 crc kubenswrapper[4778]: I0930 17:33:05.961733 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" Sep 30 17:33:06 crc kubenswrapper[4778]: I0930 17:33:06.034062 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-xfrqm"] Sep 30 17:33:06 crc kubenswrapper[4778]: I0930 17:33:06.429763 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" podUID="7e0a85c1-de10-487c-8614-250af0740399" containerName="dnsmasq-dns" containerID="cri-o://a163da67d2e87ff4381d38d39354a105e432a59f3dcbb42faed3d9d528208bfe" gracePeriod=10 Sep 30 17:33:06 crc kubenswrapper[4778]: I0930 17:33:06.932266 4778 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.108415 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-dns-svc\") pod \"7e0a85c1-de10-487c-8614-250af0740399\" (UID: \"7e0a85c1-de10-487c-8614-250af0740399\") " Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.108942 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-config\") pod \"7e0a85c1-de10-487c-8614-250af0740399\" (UID: \"7e0a85c1-de10-487c-8614-250af0740399\") " Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.108989 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-ovsdbserver-nb\") pod \"7e0a85c1-de10-487c-8614-250af0740399\" (UID: \"7e0a85c1-de10-487c-8614-250af0740399\") " Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.109109 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx2x2\" (UniqueName: \"kubernetes.io/projected/7e0a85c1-de10-487c-8614-250af0740399-kube-api-access-xx2x2\") pod \"7e0a85c1-de10-487c-8614-250af0740399\" (UID: \"7e0a85c1-de10-487c-8614-250af0740399\") " Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.114971 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e0a85c1-de10-487c-8614-250af0740399-kube-api-access-xx2x2" (OuterVolumeSpecName: "kube-api-access-xx2x2") pod "7e0a85c1-de10-487c-8614-250af0740399" (UID: "7e0a85c1-de10-487c-8614-250af0740399"). InnerVolumeSpecName "kube-api-access-xx2x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.146132 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7e0a85c1-de10-487c-8614-250af0740399" (UID: "7e0a85c1-de10-487c-8614-250af0740399"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.157340 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7e0a85c1-de10-487c-8614-250af0740399" (UID: "7e0a85c1-de10-487c-8614-250af0740399"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.162709 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-config" (OuterVolumeSpecName: "config") pod "7e0a85c1-de10-487c-8614-250af0740399" (UID: "7e0a85c1-de10-487c-8614-250af0740399"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.211089 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx2x2\" (UniqueName: \"kubernetes.io/projected/7e0a85c1-de10-487c-8614-250af0740399-kube-api-access-xx2x2\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.211123 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.211132 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.211140 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7e0a85c1-de10-487c-8614-250af0740399-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.443026 4778 generic.go:334] "Generic (PLEG): container finished" podID="7e0a85c1-de10-487c-8614-250af0740399" containerID="a163da67d2e87ff4381d38d39354a105e432a59f3dcbb42faed3d9d528208bfe" exitCode=0 Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.443067 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.443126 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" event={"ID":"7e0a85c1-de10-487c-8614-250af0740399","Type":"ContainerDied","Data":"a163da67d2e87ff4381d38d39354a105e432a59f3dcbb42faed3d9d528208bfe"} Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.443206 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-xfrqm" event={"ID":"7e0a85c1-de10-487c-8614-250af0740399","Type":"ContainerDied","Data":"da5bcd211bb3101553fd748e5bce7f4622ca8570a62a63c060ea9e216672a015"} Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.443239 4778 scope.go:117] "RemoveContainer" containerID="a163da67d2e87ff4381d38d39354a105e432a59f3dcbb42faed3d9d528208bfe" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.501488 4778 scope.go:117] "RemoveContainer" containerID="9681aa7e6415c27aa18fad52c4df9045ada8b1113219bda929fb170f0b035e91" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.501762 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.501871 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.505235 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-xfrqm"] Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.511732 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-xfrqm"] Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.519509 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.519550 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/openstack-cell1-galera-0" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.525694 4778 scope.go:117] "RemoveContainer" containerID="a163da67d2e87ff4381d38d39354a105e432a59f3dcbb42faed3d9d528208bfe" Sep 30 17:33:07 crc kubenswrapper[4778]: E0930 17:33:07.541222 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a163da67d2e87ff4381d38d39354a105e432a59f3dcbb42faed3d9d528208bfe\": container with ID starting with a163da67d2e87ff4381d38d39354a105e432a59f3dcbb42faed3d9d528208bfe not found: ID does not exist" containerID="a163da67d2e87ff4381d38d39354a105e432a59f3dcbb42faed3d9d528208bfe" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.541259 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a163da67d2e87ff4381d38d39354a105e432a59f3dcbb42faed3d9d528208bfe"} err="failed to get container status \"a163da67d2e87ff4381d38d39354a105e432a59f3dcbb42faed3d9d528208bfe\": rpc error: code = NotFound desc = could not find container \"a163da67d2e87ff4381d38d39354a105e432a59f3dcbb42faed3d9d528208bfe\": container with ID starting with a163da67d2e87ff4381d38d39354a105e432a59f3dcbb42faed3d9d528208bfe not found: ID does not exist" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.541284 4778 scope.go:117] "RemoveContainer" containerID="9681aa7e6415c27aa18fad52c4df9045ada8b1113219bda929fb170f0b035e91" Sep 30 17:33:07 crc kubenswrapper[4778]: E0930 17:33:07.541713 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9681aa7e6415c27aa18fad52c4df9045ada8b1113219bda929fb170f0b035e91\": container with ID starting with 9681aa7e6415c27aa18fad52c4df9045ada8b1113219bda929fb170f0b035e91 not found: ID does not exist" containerID="9681aa7e6415c27aa18fad52c4df9045ada8b1113219bda929fb170f0b035e91" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.541744 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9681aa7e6415c27aa18fad52c4df9045ada8b1113219bda929fb170f0b035e91"} err="failed to get container status \"9681aa7e6415c27aa18fad52c4df9045ada8b1113219bda929fb170f0b035e91\": rpc error: code = NotFound desc = could not find container \"9681aa7e6415c27aa18fad52c4df9045ada8b1113219bda929fb170f0b035e91\": container with ID starting with 9681aa7e6415c27aa18fad52c4df9045ada8b1113219bda929fb170f0b035e91 not found: ID does not exist" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.573826 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Sep 30 17:33:07 crc kubenswrapper[4778]: I0930 17:33:07.728987 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e0a85c1-de10-487c-8614-250af0740399" path="/var/lib/kubelet/pods/7e0a85c1-de10-487c-8614-250af0740399/volumes" Sep 30 17:33:08 crc kubenswrapper[4778]: I0930 17:33:08.513767 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Sep 30 17:33:09 crc kubenswrapper[4778]: I0930 17:33:09.570929 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Sep 30 17:33:09 crc kubenswrapper[4778]: I0930 17:33:09.623047 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Sep 30 17:33:10 crc kubenswrapper[4778]: I0930 17:33:10.479887 4778 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qd6f4" event={"ID":"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6","Type":"ContainerStarted","Data":"90914d8fe142e67acaad88688222e0205a0dce6fda36659da2463d73af8cb126"} Sep 30 17:33:11 crc kubenswrapper[4778]: I0930 17:33:11.490180 4778 generic.go:334] "Generic (PLEG): container finished" podID="9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6" containerID="90914d8fe142e67acaad88688222e0205a0dce6fda36659da2463d73af8cb126" exitCode=0 Sep 30 17:33:11 crc kubenswrapper[4778]: I0930 17:33:11.490237 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qd6f4" event={"ID":"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6","Type":"ContainerDied","Data":"90914d8fe142e67acaad88688222e0205a0dce6fda36659da2463d73af8cb126"} Sep 30 17:33:12 crc kubenswrapper[4778]: I0930 17:33:12.507080 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qd6f4" event={"ID":"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6","Type":"ContainerStarted","Data":"571d1203928ec5596b8ea661aeab31f93206d0aaf9ec74373a287d9c8122ac5c"} Sep 30 17:33:12 crc kubenswrapper[4778]: I0930 17:33:12.507671 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-qd6f4" event={"ID":"9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6","Type":"ContainerStarted","Data":"1b78a17842f637069418050a6e28931c31512b397cec789c87aa5a441f4e4712"} Sep 30 17:33:12 crc kubenswrapper[4778]: I0930 17:33:12.508206 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qd6f4" Sep 30 17:33:12 crc kubenswrapper[4778]: I0930 17:33:12.508265 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-qd6f4" Sep 30 17:33:12 crc kubenswrapper[4778]: I0930 17:33:12.533408 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-qd6f4" podStartSLOduration=6.016628177 podStartE2EDuration="29.533384789s" podCreationTimestamp="2025-09-30 17:32:43 +0000 UTC" firstStartedPulling="2025-09-30 17:32:46.596749417 +0000 UTC m=+905.586647220" lastFinishedPulling="2025-09-30 17:33:10.113506019 +0000 UTC m=+929.103403832" observedRunningTime="2025-09-30 17:33:12.532000317 +0000 UTC m=+931.521898170" watchObservedRunningTime="2025-09-30 17:33:12.533384789 +0000 UTC m=+931.523282622" Sep 30 17:33:14 crc kubenswrapper[4778]: I0930 17:33:14.530933 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"21244498-232a-4725-be68-da731564a70b","Type":"ContainerStarted","Data":"3eefe09f6e0df3ada748020504bd080395a167367ac042c3760780b170a75bd8"} Sep 30 17:33:14 crc kubenswrapper[4778]: I0930 17:33:14.532040 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Sep 30 17:33:14 crc kubenswrapper[4778]: I0930 17:33:14.564963 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=9.227394867 podStartE2EDuration="37.564924871s" podCreationTimestamp="2025-09-30 17:32:37 +0000 UTC" firstStartedPulling="2025-09-30 17:32:45.813676438 +0000 UTC m=+904.803574241" lastFinishedPulling="2025-09-30 17:33:14.151206402 +0000 UTC m=+933.141104245" observedRunningTime="2025-09-30 17:33:14.558885497 +0000 UTC m=+933.548783310" watchObservedRunningTime="2025-09-30 17:33:14.564924871 +0000 UTC m=+933.554822674" Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.699107 4778 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-db-create-gtlnx"] Sep 30 17:33:17 crc kubenswrapper[4778]: E0930 17:33:17.700142 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a383937-1f37-4d7c-9ddf-6aefd9fd7329" containerName="init" Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.700170 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a383937-1f37-4d7c-9ddf-6aefd9fd7329" containerName="init" Sep 30 17:33:17 crc kubenswrapper[4778]: E0930 17:33:17.700201 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="567d6f34-762c-4560-8af1-be9c5f0a1945" containerName="dnsmasq-dns" Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.700214 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="567d6f34-762c-4560-8af1-be9c5f0a1945" containerName="dnsmasq-dns" Sep 30 17:33:17 crc kubenswrapper[4778]: E0930 17:33:17.700247 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="567d6f34-762c-4560-8af1-be9c5f0a1945" containerName="init" Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.700263 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="567d6f34-762c-4560-8af1-be9c5f0a1945" containerName="init" Sep 30 17:33:17 crc kubenswrapper[4778]: E0930 17:33:17.700282 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0a85c1-de10-487c-8614-250af0740399" containerName="dnsmasq-dns" Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.700295 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0a85c1-de10-487c-8614-250af0740399" containerName="dnsmasq-dns" Sep 30 17:33:17 crc kubenswrapper[4778]: E0930 17:33:17.700313 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0a85c1-de10-487c-8614-250af0740399" containerName="init" Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.700325 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0a85c1-de10-487c-8614-250af0740399" containerName="init" Sep 30 17:33:17 crc kubenswrapper[4778]: E0930 17:33:17.700351 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a383937-1f37-4d7c-9ddf-6aefd9fd7329" containerName="dnsmasq-dns" Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.700364 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a383937-1f37-4d7c-9ddf-6aefd9fd7329" containerName="dnsmasq-dns" Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.700716 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a383937-1f37-4d7c-9ddf-6aefd9fd7329" containerName="dnsmasq-dns" Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.700754 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0a85c1-de10-487c-8614-250af0740399" containerName="dnsmasq-dns" Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.700769 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="567d6f34-762c-4560-8af1-be9c5f0a1945" containerName="dnsmasq-dns" Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.701651 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-gtlnx" Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.756478 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gtlnx"] Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.830209 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9xpg\" (UniqueName: \"kubernetes.io/projected/ecf58537-7743-4ba0-b78a-2286a7f55c4c-kube-api-access-r9xpg\") pod \"keystone-db-create-gtlnx\" (UID: \"ecf58537-7743-4ba0-b78a-2286a7f55c4c\") " pod="openstack/keystone-db-create-gtlnx" Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.900562 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-kfwtc"] Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.902345 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kfwtc" Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.911223 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kfwtc"] Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.932112 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9xpg\" (UniqueName: \"kubernetes.io/projected/ecf58537-7743-4ba0-b78a-2286a7f55c4c-kube-api-access-r9xpg\") pod \"keystone-db-create-gtlnx\" (UID: \"ecf58537-7743-4ba0-b78a-2286a7f55c4c\") " pod="openstack/keystone-db-create-gtlnx" Sep 30 17:33:17 crc kubenswrapper[4778]: I0930 17:33:17.960784 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9xpg\" (UniqueName: \"kubernetes.io/projected/ecf58537-7743-4ba0-b78a-2286a7f55c4c-kube-api-access-r9xpg\") pod \"keystone-db-create-gtlnx\" (UID: \"ecf58537-7743-4ba0-b78a-2286a7f55c4c\") " pod="openstack/keystone-db-create-gtlnx" Sep 30 17:33:18 crc kubenswrapper[4778]: I0930 17:33:18.034242 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4926f\" (UniqueName: \"kubernetes.io/projected/58ce14f2-48e1-4f93-ab82-0a35af3a4e99-kube-api-access-4926f\") pod \"placement-db-create-kfwtc\" (UID: \"58ce14f2-48e1-4f93-ab82-0a35af3a4e99\") " pod="openstack/placement-db-create-kfwtc" Sep 30 17:33:18 crc kubenswrapper[4778]: I0930 17:33:18.037430 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gtlnx" Sep 30 17:33:18 crc kubenswrapper[4778]: I0930 17:33:18.136438 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4926f\" (UniqueName: \"kubernetes.io/projected/58ce14f2-48e1-4f93-ab82-0a35af3a4e99-kube-api-access-4926f\") pod \"placement-db-create-kfwtc\" (UID: \"58ce14f2-48e1-4f93-ab82-0a35af3a4e99\") " pod="openstack/placement-db-create-kfwtc" Sep 30 17:33:18 crc kubenswrapper[4778]: I0930 17:33:18.156772 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4926f\" (UniqueName: \"kubernetes.io/projected/58ce14f2-48e1-4f93-ab82-0a35af3a4e99-kube-api-access-4926f\") pod \"placement-db-create-kfwtc\" (UID: \"58ce14f2-48e1-4f93-ab82-0a35af3a4e99\") " pod="openstack/placement-db-create-kfwtc" Sep 30 17:33:18 crc kubenswrapper[4778]: I0930 17:33:18.221469 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-kfwtc" Sep 30 17:33:18 crc kubenswrapper[4778]: I0930 17:33:18.572516 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-gtlnx"] Sep 30 17:33:18 crc kubenswrapper[4778]: W0930 17:33:18.576084 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecf58537_7743_4ba0_b78a_2286a7f55c4c.slice/crio-ebcab17245b6b7b0c3d32c12f5aa3be122ef380ee4ec9016264e5e8b97c630c4 WatchSource:0}: Error finding container ebcab17245b6b7b0c3d32c12f5aa3be122ef380ee4ec9016264e5e8b97c630c4: Status 404 returned error can't find the container with id ebcab17245b6b7b0c3d32c12f5aa3be122ef380ee4ec9016264e5e8b97c630c4 Sep 30 17:33:18 crc kubenswrapper[4778]: I0930 17:33:18.673044 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-kfwtc"] Sep 30 17:33:18 crc kubenswrapper[4778]: W0930 17:33:18.678679 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58ce14f2_48e1_4f93_ab82_0a35af3a4e99.slice/crio-0ab48d21b3b8a4fbf01983cf224dda654eed594306781c38f7b70e7e8881d910 WatchSource:0}: Error finding container 0ab48d21b3b8a4fbf01983cf224dda654eed594306781c38f7b70e7e8881d910: Status 404 returned error can't find the container with id 0ab48d21b3b8a4fbf01983cf224dda654eed594306781c38f7b70e7e8881d910 Sep 30 17:33:19 crc kubenswrapper[4778]: I0930 17:33:19.592322 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"c60ce32d-deec-4c60-9192-6b090ae53773","Type":"ContainerStarted","Data":"21907d46898365c749b9e8d539254b8f95cd6c2b3c3515471b30756b8b847037"} Sep 30 17:33:19 crc kubenswrapper[4778]: I0930 17:33:19.596336 4778 generic.go:334] "Generic (PLEG): container finished" podID="58ce14f2-48e1-4f93-ab82-0a35af3a4e99" containerID="b6a36cc0389c35e552f3e26ab13b21d115aac0b76a03ebdc379e7ee64e3e9c7d" exitCode=0 Sep 30 17:33:19 crc kubenswrapper[4778]: I0930 17:33:19.596687 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kfwtc" event={"ID":"58ce14f2-48e1-4f93-ab82-0a35af3a4e99","Type":"ContainerDied","Data":"b6a36cc0389c35e552f3e26ab13b21d115aac0b76a03ebdc379e7ee64e3e9c7d"} Sep 30 17:33:19 crc kubenswrapper[4778]: I0930 17:33:19.596824 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kfwtc" event={"ID":"58ce14f2-48e1-4f93-ab82-0a35af3a4e99","Type":"ContainerStarted","Data":"0ab48d21b3b8a4fbf01983cf224dda654eed594306781c38f7b70e7e8881d910"} Sep 30 17:33:19 crc kubenswrapper[4778]: I0930 17:33:19.598922 4778 generic.go:334] "Generic (PLEG): container finished" podID="ecf58537-7743-4ba0-b78a-2286a7f55c4c" containerID="ed4e2bbff3d0e1e73a3305bf44e27ea16089afd45fa23d87649fe8e2688703ec" exitCode=0 Sep 30 17:33:19 crc kubenswrapper[4778]: I0930 17:33:19.598972 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gtlnx" event={"ID":"ecf58537-7743-4ba0-b78a-2286a7f55c4c","Type":"ContainerDied","Data":"ed4e2bbff3d0e1e73a3305bf44e27ea16089afd45fa23d87649fe8e2688703ec"} Sep 30 17:33:19 crc kubenswrapper[4778]: I0930 17:33:19.599000 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gtlnx" event={"ID":"ecf58537-7743-4ba0-b78a-2286a7f55c4c","Type":"ContainerStarted","Data":"ebcab17245b6b7b0c3d32c12f5aa3be122ef380ee4ec9016264e5e8b97c630c4"} Sep 30 17:33:19 crc 
kubenswrapper[4778]: I0930 17:33:19.616791 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.18582676 podStartE2EDuration="36.616767691s" podCreationTimestamp="2025-09-30 17:32:43 +0000 UTC" firstStartedPulling="2025-09-30 17:32:45.916801116 +0000 UTC m=+904.906698929" lastFinishedPulling="2025-09-30 17:33:19.347742037 +0000 UTC m=+938.337639860" observedRunningTime="2025-09-30 17:33:19.610450649 +0000 UTC m=+938.600348452" watchObservedRunningTime="2025-09-30 17:33:19.616767691 +0000 UTC m=+938.606665504" Sep 30 17:33:19 crc kubenswrapper[4778]: I0930 17:33:19.770166 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Sep 30 17:33:20 crc kubenswrapper[4778]: I0930 17:33:20.770145 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Sep 30 17:33:21 crc kubenswrapper[4778]: I0930 17:33:21.039901 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kfwtc" Sep 30 17:33:21 crc kubenswrapper[4778]: I0930 17:33:21.047030 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gtlnx" Sep 30 17:33:21 crc kubenswrapper[4778]: I0930 17:33:21.190574 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9xpg\" (UniqueName: \"kubernetes.io/projected/ecf58537-7743-4ba0-b78a-2286a7f55c4c-kube-api-access-r9xpg\") pod \"ecf58537-7743-4ba0-b78a-2286a7f55c4c\" (UID: \"ecf58537-7743-4ba0-b78a-2286a7f55c4c\") " Sep 30 17:33:21 crc kubenswrapper[4778]: I0930 17:33:21.190719 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4926f\" (UniqueName: \"kubernetes.io/projected/58ce14f2-48e1-4f93-ab82-0a35af3a4e99-kube-api-access-4926f\") pod \"58ce14f2-48e1-4f93-ab82-0a35af3a4e99\" (UID: \"58ce14f2-48e1-4f93-ab82-0a35af3a4e99\") " Sep 30 17:33:21 crc kubenswrapper[4778]: I0930 17:33:21.198181 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58ce14f2-48e1-4f93-ab82-0a35af3a4e99-kube-api-access-4926f" (OuterVolumeSpecName: "kube-api-access-4926f") pod "58ce14f2-48e1-4f93-ab82-0a35af3a4e99" (UID: "58ce14f2-48e1-4f93-ab82-0a35af3a4e99"). InnerVolumeSpecName "kube-api-access-4926f". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:33:21 crc kubenswrapper[4778]: I0930 17:33:21.199150 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecf58537-7743-4ba0-b78a-2286a7f55c4c-kube-api-access-r9xpg" (OuterVolumeSpecName: "kube-api-access-r9xpg") pod "ecf58537-7743-4ba0-b78a-2286a7f55c4c" (UID: "ecf58537-7743-4ba0-b78a-2286a7f55c4c"). InnerVolumeSpecName "kube-api-access-r9xpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:33:21 crc kubenswrapper[4778]: I0930 17:33:21.292611 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9xpg\" (UniqueName: \"kubernetes.io/projected/ecf58537-7743-4ba0-b78a-2286a7f55c4c-kube-api-access-r9xpg\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:21 crc kubenswrapper[4778]: I0930 17:33:21.292707 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4926f\" (UniqueName: \"kubernetes.io/projected/58ce14f2-48e1-4f93-ab82-0a35af3a4e99-kube-api-access-4926f\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:21 crc kubenswrapper[4778]: I0930 17:33:21.617869 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-gtlnx" event={"ID":"ecf58537-7743-4ba0-b78a-2286a7f55c4c","Type":"ContainerDied","Data":"ebcab17245b6b7b0c3d32c12f5aa3be122ef380ee4ec9016264e5e8b97c630c4"} Sep 30 17:33:21 crc kubenswrapper[4778]: I0930 17:33:21.618315 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebcab17245b6b7b0c3d32c12f5aa3be122ef380ee4ec9016264e5e8b97c630c4" Sep 30 17:33:21 crc kubenswrapper[4778]: I0930 17:33:21.617927 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-gtlnx" Sep 30 17:33:21 crc kubenswrapper[4778]: I0930 17:33:21.620806 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-kfwtc" Sep 30 17:33:21 crc kubenswrapper[4778]: I0930 17:33:21.620828 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-kfwtc" event={"ID":"58ce14f2-48e1-4f93-ab82-0a35af3a4e99","Type":"ContainerDied","Data":"0ab48d21b3b8a4fbf01983cf224dda654eed594306781c38f7b70e7e8881d910"} Sep 30 17:33:21 crc kubenswrapper[4778]: I0930 17:33:21.620876 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ab48d21b3b8a4fbf01983cf224dda654eed594306781c38f7b70e7e8881d910" Sep 30 17:33:22 crc kubenswrapper[4778]: I0930 17:33:22.998762 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 30 17:33:23 crc kubenswrapper[4778]: I0930 17:33:23.217506 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-vqn7w"] Sep 30 17:33:23 crc kubenswrapper[4778]: E0930 17:33:23.218038 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58ce14f2-48e1-4f93-ab82-0a35af3a4e99" containerName="mariadb-database-create" Sep 30 17:33:23 crc kubenswrapper[4778]: I0930 17:33:23.218066 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="58ce14f2-48e1-4f93-ab82-0a35af3a4e99" containerName="mariadb-database-create" Sep 30 17:33:23 crc kubenswrapper[4778]: E0930 17:33:23.218093 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecf58537-7743-4ba0-b78a-2286a7f55c4c" containerName="mariadb-database-create" Sep 30 17:33:23 crc kubenswrapper[4778]: I0930 17:33:23.218105 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecf58537-7743-4ba0-b78a-2286a7f55c4c" containerName="mariadb-database-create" Sep 30 17:33:23 crc kubenswrapper[4778]: I0930 17:33:23.218329 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecf58537-7743-4ba0-b78a-2286a7f55c4c" containerName="mariadb-database-create" Sep 30 17:33:23 crc kubenswrapper[4778]: I0930 17:33:23.218382 4778 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="58ce14f2-48e1-4f93-ab82-0a35af3a4e99" containerName="mariadb-database-create" Sep 30 17:33:23 crc kubenswrapper[4778]: I0930 17:33:23.219161 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vqn7w" Sep 30 17:33:23 crc kubenswrapper[4778]: I0930 17:33:23.239847 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-vqn7w"] Sep 30 17:33:23 crc kubenswrapper[4778]: I0930 17:33:23.333900 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6lml\" (UniqueName: \"kubernetes.io/projected/9cae8b89-9230-48a1-9cdd-c604911a37f8-kube-api-access-v6lml\") pod \"glance-db-create-vqn7w\" (UID: \"9cae8b89-9230-48a1-9cdd-c604911a37f8\") " pod="openstack/glance-db-create-vqn7w" Sep 30 17:33:23 crc kubenswrapper[4778]: I0930 17:33:23.435336 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6lml\" (UniqueName: \"kubernetes.io/projected/9cae8b89-9230-48a1-9cdd-c604911a37f8-kube-api-access-v6lml\") pod \"glance-db-create-vqn7w\" (UID: \"9cae8b89-9230-48a1-9cdd-c604911a37f8\") " pod="openstack/glance-db-create-vqn7w" Sep 30 17:33:23 crc kubenswrapper[4778]: I0930 17:33:23.469719 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6lml\" (UniqueName: \"kubernetes.io/projected/9cae8b89-9230-48a1-9cdd-c604911a37f8-kube-api-access-v6lml\") pod \"glance-db-create-vqn7w\" (UID: \"9cae8b89-9230-48a1-9cdd-c604911a37f8\") " pod="openstack/glance-db-create-vqn7w" Sep 30 17:33:23 crc kubenswrapper[4778]: I0930 17:33:23.555175 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vqn7w" Sep 30 17:33:23 crc kubenswrapper[4778]: I0930 17:33:23.831898 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Sep 30 17:33:24 crc kubenswrapper[4778]: I0930 17:33:24.071422 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-vqn7w"] Sep 30 17:33:24 crc kubenswrapper[4778]: W0930 17:33:24.079847 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cae8b89_9230_48a1_9cdd_c604911a37f8.slice/crio-332881aa6011821af5fc9c0cbfca1ff4aa5bb06f226b280c4e7ba1b6e64b641c WatchSource:0}: Error finding container 332881aa6011821af5fc9c0cbfca1ff4aa5bb06f226b280c4e7ba1b6e64b641c: Status 404 returned error can't find the container with id 332881aa6011821af5fc9c0cbfca1ff4aa5bb06f226b280c4e7ba1b6e64b641c Sep 30 17:33:24 crc kubenswrapper[4778]: I0930 17:33:24.663123 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vqn7w" event={"ID":"9cae8b89-9230-48a1-9cdd-c604911a37f8","Type":"ContainerStarted","Data":"332881aa6011821af5fc9c0cbfca1ff4aa5bb06f226b280c4e7ba1b6e64b641c"} Sep 30 17:33:24 crc kubenswrapper[4778]: I0930 17:33:24.863452 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.041915 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.044746 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.048260 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-lc6qq" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.050976 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.051188 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.052639 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.052762 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.163546 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-scripts\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.163607 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.163679 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.163709 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.163728 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.163987 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bn4z\" (UniqueName: \"kubernetes.io/projected/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-kube-api-access-5bn4z\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.164051 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-config\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: 
I0930 17:33:25.266033 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bn4z\" (UniqueName: \"kubernetes.io/projected/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-kube-api-access-5bn4z\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.266111 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-config\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.266215 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-scripts\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.266284 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.266418 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.266511 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.266571 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.267388 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-config\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.267740 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.269039 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-scripts\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.274711 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.274907 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.282299 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.288862 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bn4z\" (UniqueName: \"kubernetes.io/projected/2e131ac5-4895-42fa-b7f5-6a37d5eafe3c-kube-api-access-5bn4z\") pod \"ovn-northd-0\" (UID: \"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c\") " pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.367375 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.684091 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vqn7w" event={"ID":"9cae8b89-9230-48a1-9cdd-c604911a37f8","Type":"ContainerStarted","Data":"732f1d8ff0a3c72e429a3d061e666cdc934cb83c356945dc4d07abce0527e2c9"} Sep 30 17:33:25 crc kubenswrapper[4778]: I0930 17:33:25.689138 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Sep 30 17:33:25 crc kubenswrapper[4778]: W0930 17:33:25.689790 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2e131ac5_4895_42fa_b7f5_6a37d5eafe3c.slice/crio-14728c51657eaebc8c90f402a800daa2db9c7bb516a1952b2daead545e0a496f WatchSource:0}: Error finding container 14728c51657eaebc8c90f402a800daa2db9c7bb516a1952b2daead545e0a496f: Status 404 returned error can't find the container with id 14728c51657eaebc8c90f402a800daa2db9c7bb516a1952b2daead545e0a496f Sep 30 17:33:26 crc kubenswrapper[4778]: I0930 17:33:26.697046 4778 generic.go:334] "Generic (PLEG): container finished" podID="9cae8b89-9230-48a1-9cdd-c604911a37f8" containerID="732f1d8ff0a3c72e429a3d061e666cdc934cb83c356945dc4d07abce0527e2c9" exitCode=0 Sep 30 17:33:26 crc kubenswrapper[4778]: I0930 17:33:26.697177 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vqn7w" event={"ID":"9cae8b89-9230-48a1-9cdd-c604911a37f8","Type":"ContainerDied","Data":"732f1d8ff0a3c72e429a3d061e666cdc934cb83c356945dc4d07abce0527e2c9"} Sep 30 17:33:26 crc kubenswrapper[4778]: I0930 17:33:26.700104 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c","Type":"ContainerStarted","Data":"14728c51657eaebc8c90f402a800daa2db9c7bb516a1952b2daead545e0a496f"} Sep 30 17:33:27 crc kubenswrapper[4778]: I0930 17:33:27.712855 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c","Type":"ContainerStarted","Data":"48a2bfa0ba3fbc30fcfdf12dd66ad563d70fbf15c71e7bd91a7fdbd174a0b5b1"} Sep 30 17:33:27 crc kubenswrapper[4778]: I0930 17:33:27.713268 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"2e131ac5-4895-42fa-b7f5-6a37d5eafe3c","Type":"ContainerStarted","Data":"0b753b70a20e52a5e3c55e9b396e26a5772d03a2d161abf2667930ae961aa76f"} Sep 30 17:33:27 crc kubenswrapper[4778]: I0930 17:33:27.767542 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-e75d-account-create-ntsr9"] Sep 30 17:33:27 crc kubenswrapper[4778]: I0930 17:33:27.770514 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e75d-account-create-ntsr9" Sep 30 17:33:27 crc kubenswrapper[4778]: I0930 17:33:27.773512 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Sep 30 17:33:27 crc kubenswrapper[4778]: I0930 17:33:27.779714 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.907591271 podStartE2EDuration="2.779694251s" podCreationTimestamp="2025-09-30 17:33:25 +0000 UTC" firstStartedPulling="2025-09-30 17:33:25.691587366 +0000 UTC m=+944.681485189" lastFinishedPulling="2025-09-30 17:33:26.563690366 +0000 UTC m=+945.553588169" observedRunningTime="2025-09-30 17:33:27.750959495 +0000 UTC m=+946.740857358" watchObservedRunningTime="2025-09-30 17:33:27.779694251 +0000 UTC m=+946.769592064" Sep 30 17:33:27 crc kubenswrapper[4778]: I0930 17:33:27.788415 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e75d-account-create-ntsr9"] Sep 30 17:33:27 crc kubenswrapper[4778]: I0930 17:33:27.824929 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q95gm\" (UniqueName: \"kubernetes.io/projected/a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd-kube-api-access-q95gm\") pod \"keystone-e75d-account-create-ntsr9\" (UID: \"a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd\") " pod="openstack/keystone-e75d-account-create-ntsr9" Sep 30 17:33:27 crc kubenswrapper[4778]: I0930 17:33:27.926535 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q95gm\" (UniqueName: \"kubernetes.io/projected/a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd-kube-api-access-q95gm\") pod \"keystone-e75d-account-create-ntsr9\" (UID: \"a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd\") " pod="openstack/keystone-e75d-account-create-ntsr9" Sep 30 17:33:27 crc kubenswrapper[4778]: I0930 17:33:27.946528 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q95gm\" (UniqueName: \"kubernetes.io/projected/a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd-kube-api-access-q95gm\") pod \"keystone-e75d-account-create-ntsr9\" (UID: \"a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd\") " pod="openstack/keystone-e75d-account-create-ntsr9" Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.039697 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6b3b-account-create-tkcf4"] Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.044379 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6b3b-account-create-tkcf4" Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.047286 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.054087 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b3b-account-create-tkcf4"] Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.075354 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vqn7w" Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.093569 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e75d-account-create-ntsr9" Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.129329 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6lml\" (UniqueName: \"kubernetes.io/projected/9cae8b89-9230-48a1-9cdd-c604911a37f8-kube-api-access-v6lml\") pod \"9cae8b89-9230-48a1-9cdd-c604911a37f8\" (UID: \"9cae8b89-9230-48a1-9cdd-c604911a37f8\") " Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.129686 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frjzk\" (UniqueName: \"kubernetes.io/projected/f0f118d4-4773-4b15-9a0a-ad44742e1789-kube-api-access-frjzk\") pod \"placement-6b3b-account-create-tkcf4\" (UID: \"f0f118d4-4773-4b15-9a0a-ad44742e1789\") " pod="openstack/placement-6b3b-account-create-tkcf4" Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.133130 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cae8b89-9230-48a1-9cdd-c604911a37f8-kube-api-access-v6lml" (OuterVolumeSpecName: "kube-api-access-v6lml") pod "9cae8b89-9230-48a1-9cdd-c604911a37f8" (UID: "9cae8b89-9230-48a1-9cdd-c604911a37f8"). InnerVolumeSpecName "kube-api-access-v6lml". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.231293 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frjzk\" (UniqueName: \"kubernetes.io/projected/f0f118d4-4773-4b15-9a0a-ad44742e1789-kube-api-access-frjzk\") pod \"placement-6b3b-account-create-tkcf4\" (UID: \"f0f118d4-4773-4b15-9a0a-ad44742e1789\") " pod="openstack/placement-6b3b-account-create-tkcf4" Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.231433 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6lml\" (UniqueName: \"kubernetes.io/projected/9cae8b89-9230-48a1-9cdd-c604911a37f8-kube-api-access-v6lml\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.253559 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frjzk\" (UniqueName: \"kubernetes.io/projected/f0f118d4-4773-4b15-9a0a-ad44742e1789-kube-api-access-frjzk\") pod \"placement-6b3b-account-create-tkcf4\" (UID: \"f0f118d4-4773-4b15-9a0a-ad44742e1789\") " pod="openstack/placement-6b3b-account-create-tkcf4" Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.369540 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e75d-account-create-ntsr9"] Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.386767 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6b3b-account-create-tkcf4" Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.722846 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vqn7w" event={"ID":"9cae8b89-9230-48a1-9cdd-c604911a37f8","Type":"ContainerDied","Data":"332881aa6011821af5fc9c0cbfca1ff4aa5bb06f226b280c4e7ba1b6e64b641c"} Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.723194 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vqn7w" Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.723252 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="332881aa6011821af5fc9c0cbfca1ff4aa5bb06f226b280c4e7ba1b6e64b641c" Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.726729 4778 generic.go:334] "Generic (PLEG): container finished" podID="a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd" containerID="a208f50b480873a95044cd8c3b446c771723ff42dfd34ca6bb4bf2a09f7d0673" exitCode=0 Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.727945 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e75d-account-create-ntsr9" event={"ID":"a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd","Type":"ContainerDied","Data":"a208f50b480873a95044cd8c3b446c771723ff42dfd34ca6bb4bf2a09f7d0673"} Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.727980 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e75d-account-create-ntsr9" event={"ID":"a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd","Type":"ContainerStarted","Data":"f83ec5611425b2ca11de523a0f746a2ebb7cc0b50ad4c8577f212b41a9729154"} Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.728003 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.841443 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b3b-account-create-tkcf4"] Sep 30 17:33:28 crc kubenswrapper[4778]: I0930 17:33:28.960295 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-79nnj" podUID="d43aaa67-2f2b-4045-80af-3593da87ed64" containerName="ovn-controller" probeResult="failure" output=< Sep 30 17:33:28 crc kubenswrapper[4778]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 17:33:28 crc kubenswrapper[4778]: > Sep 30 17:33:29 crc kubenswrapper[4778]: I0930 17:33:29.736426 4778 generic.go:334] "Generic (PLEG): container finished" podID="a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1" containerID="1e6e1df402ba14a96802eacaf1517a755fed4b9c9b882dc7b976f77bc3c679c8" exitCode=0 Sep 30 17:33:29 crc kubenswrapper[4778]: I0930 17:33:29.736529 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1","Type":"ContainerDied","Data":"1e6e1df402ba14a96802eacaf1517a755fed4b9c9b882dc7b976f77bc3c679c8"} Sep 30 17:33:29 crc kubenswrapper[4778]: I0930 17:33:29.739258 4778 generic.go:334] "Generic (PLEG): container finished" podID="227061d9-b3e7-4711-92f5-283ca4af1412" containerID="39ae642e9e5a90afc68250d2f463142ba3551606556210a8b25593d98603e228" exitCode=0 Sep 30 17:33:29 crc kubenswrapper[4778]: I0930 17:33:29.739319 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"227061d9-b3e7-4711-92f5-283ca4af1412","Type":"ContainerDied","Data":"39ae642e9e5a90afc68250d2f463142ba3551606556210a8b25593d98603e228"} Sep 30 17:33:29 crc kubenswrapper[4778]: I0930 17:33:29.743782 4778 generic.go:334] "Generic (PLEG): container finished" podID="f0f118d4-4773-4b15-9a0a-ad44742e1789" containerID="66a07cca25e1471cfffb95e97d9fca21966fbbd1e9c049eaa3448c648b688da0" exitCode=0 Sep 30 17:33:29 crc kubenswrapper[4778]: I0930 17:33:29.743888 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b3b-account-create-tkcf4" event={"ID":"f0f118d4-4773-4b15-9a0a-ad44742e1789","Type":"ContainerDied","Data":"66a07cca25e1471cfffb95e97d9fca21966fbbd1e9c049eaa3448c648b688da0"} Sep 30 17:33:29 crc kubenswrapper[4778]: I0930 17:33:29.743961 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b3b-account-create-tkcf4" event={"ID":"f0f118d4-4773-4b15-9a0a-ad44742e1789","Type":"ContainerStarted","Data":"876d3751ec8b2712050298391332134f49a77a1d345896b5cf76f3df1357173b"} Sep 30 17:33:30 crc kubenswrapper[4778]: I0930 17:33:30.048320 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e75d-account-create-ntsr9" Sep 30 17:33:30 crc kubenswrapper[4778]: I0930 17:33:30.176876 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q95gm\" (UniqueName: \"kubernetes.io/projected/a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd-kube-api-access-q95gm\") pod \"a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd\" (UID: \"a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd\") " Sep 30 17:33:30 crc kubenswrapper[4778]: I0930 17:33:30.182999 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd-kube-api-access-q95gm" (OuterVolumeSpecName: "kube-api-access-q95gm") pod "a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd" (UID: "a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd"). InnerVolumeSpecName "kube-api-access-q95gm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:33:30 crc kubenswrapper[4778]: I0930 17:33:30.279199 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q95gm\" (UniqueName: \"kubernetes.io/projected/a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd-kube-api-access-q95gm\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:30 crc kubenswrapper[4778]: I0930 17:33:30.759461 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e75d-account-create-ntsr9" event={"ID":"a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd","Type":"ContainerDied","Data":"f83ec5611425b2ca11de523a0f746a2ebb7cc0b50ad4c8577f212b41a9729154"} Sep 30 17:33:30 crc kubenswrapper[4778]: I0930 17:33:30.759490 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-e75d-account-create-ntsr9" Sep 30 17:33:30 crc kubenswrapper[4778]: I0930 17:33:30.759534 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f83ec5611425b2ca11de523a0f746a2ebb7cc0b50ad4c8577f212b41a9729154" Sep 30 17:33:30 crc kubenswrapper[4778]: I0930 17:33:30.762471 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1","Type":"ContainerStarted","Data":"c68df07e802ae61df00d4137b764bc1c241b968ec9f6932f1ac4e2c1fbda6293"} Sep 30 17:33:30 crc kubenswrapper[4778]: I0930 17:33:30.763311 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:33:30 crc kubenswrapper[4778]: I0930 17:33:30.765580 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"227061d9-b3e7-4711-92f5-283ca4af1412","Type":"ContainerStarted","Data":"fd040ed00c1d4fb097320678064b473321d4c8bff4f37c174d5a036c037b69c5"} Sep 30 17:33:30 crc kubenswrapper[4778]: I0930 17:33:30.765990 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 17:33:30 crc kubenswrapper[4778]: I0930 17:33:30.814719 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=45.147522159 podStartE2EDuration="57.814700125s" podCreationTimestamp="2025-09-30 17:32:33 +0000 UTC" firstStartedPulling="2025-09-30 17:32:45.210402986 +0000 UTC m=+904.200300789" lastFinishedPulling="2025-09-30 17:32:57.877580952 +0000 UTC m=+916.867478755" observedRunningTime="2025-09-30 17:33:30.799014277 +0000 UTC m=+949.788912110" watchObservedRunningTime="2025-09-30 17:33:30.814700125 +0000 UTC m=+949.804597928" Sep 30 17:33:30 crc kubenswrapper[4778]: I0930 17:33:30.832163 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=45.004935835 podStartE2EDuration="57.832146547s" podCreationTimestamp="2025-09-30 17:32:33 +0000 UTC" firstStartedPulling="2025-09-30 17:32:45.367269581 +0000 UTC m=+904.357167374" lastFinishedPulling="2025-09-30 17:32:58.194480283 +0000 UTC m=+917.184378086" observedRunningTime="2025-09-30 17:33:30.829116754 +0000 UTC m=+949.819014567" watchObservedRunningTime="2025-09-30 17:33:30.832146547 +0000 UTC m=+949.822044350" Sep 30 17:33:31 crc kubenswrapper[4778]: I0930 17:33:31.104983 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b3b-account-create-tkcf4" Sep 30 17:33:31 crc kubenswrapper[4778]: I0930 17:33:31.193658 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frjzk\" (UniqueName: \"kubernetes.io/projected/f0f118d4-4773-4b15-9a0a-ad44742e1789-kube-api-access-frjzk\") pod \"f0f118d4-4773-4b15-9a0a-ad44742e1789\" (UID: \"f0f118d4-4773-4b15-9a0a-ad44742e1789\") " Sep 30 17:33:31 crc kubenswrapper[4778]: I0930 17:33:31.201711 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0f118d4-4773-4b15-9a0a-ad44742e1789-kube-api-access-frjzk" (OuterVolumeSpecName: "kube-api-access-frjzk") pod "f0f118d4-4773-4b15-9a0a-ad44742e1789" (UID: "f0f118d4-4773-4b15-9a0a-ad44742e1789"). InnerVolumeSpecName "kube-api-access-frjzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:33:31 crc kubenswrapper[4778]: I0930 17:33:31.295961 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frjzk\" (UniqueName: \"kubernetes.io/projected/f0f118d4-4773-4b15-9a0a-ad44742e1789-kube-api-access-frjzk\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:31 crc kubenswrapper[4778]: I0930 17:33:31.777658 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b3b-account-create-tkcf4" event={"ID":"f0f118d4-4773-4b15-9a0a-ad44742e1789","Type":"ContainerDied","Data":"876d3751ec8b2712050298391332134f49a77a1d345896b5cf76f3df1357173b"} Sep 30 17:33:31 crc kubenswrapper[4778]: I0930 17:33:31.777719 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="876d3751ec8b2712050298391332134f49a77a1d345896b5cf76f3df1357173b" Sep 30 17:33:31 crc kubenswrapper[4778]: I0930 17:33:31.777953 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b3b-account-create-tkcf4" Sep 30 17:33:33 crc kubenswrapper[4778]: I0930 17:33:33.953687 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-79nnj" podUID="d43aaa67-2f2b-4045-80af-3593da87ed64" containerName="ovn-controller" probeResult="failure" output=< Sep 30 17:33:33 crc kubenswrapper[4778]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 17:33:33 crc kubenswrapper[4778]: > Sep 30 17:33:38 crc kubenswrapper[4778]: I0930 17:33:38.971243 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-79nnj" podUID="d43aaa67-2f2b-4045-80af-3593da87ed64" containerName="ovn-controller" probeResult="failure" output=< Sep 30 17:33:38 crc kubenswrapper[4778]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 17:33:38 crc kubenswrapper[4778]: > Sep 30 17:33:40 crc kubenswrapper[4778]: I0930 17:33:40.473166 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 30 17:33:43 crc kubenswrapper[4778]: I0930 17:33:43.364059 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-dda9-account-create-dnfpr"] Sep 30 17:33:43 crc kubenswrapper[4778]: E0930 17:33:43.365259 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0f118d4-4773-4b15-9a0a-ad44742e1789" containerName="mariadb-account-create" Sep 30 17:33:43 crc kubenswrapper[4778]: I0930 17:33:43.365281 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0f118d4-4773-4b15-9a0a-ad44742e1789" containerName="mariadb-account-create" Sep 30 17:33:43 crc kubenswrapper[4778]: E0930 17:33:43.365311 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd" containerName="mariadb-account-create" Sep 30 17:33:43 crc kubenswrapper[4778]: I0930 17:33:43.365321 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd" containerName="mariadb-account-create" Sep 30 17:33:43 crc kubenswrapper[4778]: E0930 17:33:43.365348 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cae8b89-9230-48a1-9cdd-c604911a37f8" containerName="mariadb-database-create" Sep 30 17:33:43 crc kubenswrapper[4778]: I0930 17:33:43.365360 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cae8b89-9230-48a1-9cdd-c604911a37f8" containerName="mariadb-database-create" Sep 30 17:33:43 crc 
kubenswrapper[4778]: I0930 17:33:43.365675 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd" containerName="mariadb-account-create" Sep 30 17:33:43 crc kubenswrapper[4778]: I0930 17:33:43.365705 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cae8b89-9230-48a1-9cdd-c604911a37f8" containerName="mariadb-database-create" Sep 30 17:33:43 crc kubenswrapper[4778]: I0930 17:33:43.365721 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0f118d4-4773-4b15-9a0a-ad44742e1789" containerName="mariadb-account-create" Sep 30 17:33:43 crc kubenswrapper[4778]: I0930 17:33:43.366390 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-dda9-account-create-dnfpr" Sep 30 17:33:43 crc kubenswrapper[4778]: I0930 17:33:43.369132 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Sep 30 17:33:43 crc kubenswrapper[4778]: I0930 17:33:43.392527 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-dda9-account-create-dnfpr"] Sep 30 17:33:43 crc kubenswrapper[4778]: I0930 17:33:43.410577 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq9rb\" (UniqueName: \"kubernetes.io/projected/7b9285b1-4e8e-4600-9dbb-1b0bf16a742d-kube-api-access-wq9rb\") pod \"glance-dda9-account-create-dnfpr\" (UID: \"7b9285b1-4e8e-4600-9dbb-1b0bf16a742d\") " pod="openstack/glance-dda9-account-create-dnfpr" Sep 30 17:33:43 crc kubenswrapper[4778]: I0930 17:33:43.513706 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq9rb\" (UniqueName: \"kubernetes.io/projected/7b9285b1-4e8e-4600-9dbb-1b0bf16a742d-kube-api-access-wq9rb\") pod \"glance-dda9-account-create-dnfpr\" (UID: \"7b9285b1-4e8e-4600-9dbb-1b0bf16a742d\") " pod="openstack/glance-dda9-account-create-dnfpr" Sep 30 17:33:43 crc kubenswrapper[4778]: I0930 17:33:43.534333 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq9rb\" (UniqueName: \"kubernetes.io/projected/7b9285b1-4e8e-4600-9dbb-1b0bf16a742d-kube-api-access-wq9rb\") pod \"glance-dda9-account-create-dnfpr\" (UID: \"7b9285b1-4e8e-4600-9dbb-1b0bf16a742d\") " pod="openstack/glance-dda9-account-create-dnfpr" Sep 30 17:33:43 crc kubenswrapper[4778]: I0930 17:33:43.700170 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-dda9-account-create-dnfpr" Sep 30 17:33:43 crc kubenswrapper[4778]: I0930 17:33:43.944314 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-79nnj" podUID="d43aaa67-2f2b-4045-80af-3593da87ed64" containerName="ovn-controller" probeResult="failure" output=< Sep 30 17:33:43 crc kubenswrapper[4778]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 17:33:43 crc kubenswrapper[4778]: > Sep 30 17:33:43 crc kubenswrapper[4778]: I0930 17:33:43.999401 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qd6f4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.000886 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-qd6f4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.142232 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-dda9-account-create-dnfpr"] Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.230643 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-79nnj-config-s8tn4"] Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.231670 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.234602 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.240786 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-79nnj-config-s8tn4"] Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.326424 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e71e3d75-1d61-4c1b-ad85-f7677a471718-additional-scripts\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.326775 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-log-ovn\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.326822 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-run\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.326929 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e71e3d75-1d61-4c1b-ad85-f7677a471718-scripts\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.326981 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-run-ovn\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.327106 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnbvq\" (UniqueName: \"kubernetes.io/projected/e71e3d75-1d61-4c1b-ad85-f7677a471718-kube-api-access-fnbvq\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.428588 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-run\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.428687 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e71e3d75-1d61-4c1b-ad85-f7677a471718-scripts\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.428711 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-run-ovn\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.428756 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnbvq\" (UniqueName: \"kubernetes.io/projected/e71e3d75-1d61-4c1b-ad85-f7677a471718-kube-api-access-fnbvq\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.428820 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e71e3d75-1d61-4c1b-ad85-f7677a471718-additional-scripts\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.428849 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-log-ovn\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.429021 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-run-ovn\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.429041 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-log-ovn\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.429041 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-run\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.429515 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e71e3d75-1d61-4c1b-ad85-f7677a471718-additional-scripts\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.430554 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e71e3d75-1d61-4c1b-ad85-f7677a471718-scripts\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.452516 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnbvq\" (UniqueName: \"kubernetes.io/projected/e71e3d75-1d61-4c1b-ad85-f7677a471718-kube-api-access-fnbvq\") pod \"ovn-controller-79nnj-config-s8tn4\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.545105 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.702777 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.898580 4778 generic.go:334] "Generic (PLEG): container finished" podID="7b9285b1-4e8e-4600-9dbb-1b0bf16a742d" containerID="306c584e17e3861c365bcb461e949b8422b905692509357c61a7828cfb7505e3" exitCode=0 Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.898744 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-dda9-account-create-dnfpr" event={"ID":"7b9285b1-4e8e-4600-9dbb-1b0bf16a742d","Type":"ContainerDied","Data":"306c584e17e3861c365bcb461e949b8422b905692509357c61a7828cfb7505e3"} Sep 30 17:33:44 crc kubenswrapper[4778]: I0930 17:33:44.898800 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-dda9-account-create-dnfpr" event={"ID":"7b9285b1-4e8e-4600-9dbb-1b0bf16a742d","Type":"ContainerStarted","Data":"a179144dccf9ace371e2936a71b8fd21f7a5c4ed799bb730bf3a29413dce8500"} Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.024100 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-tgmpn"] Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.025234 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.025344 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tgmpn" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.053464 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tgmpn"] Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.147050 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxz24\" (UniqueName: \"kubernetes.io/projected/90afd016-0994-4d25-ae04-1609f9b811b0-kube-api-access-bxz24\") pod \"cinder-db-create-tgmpn\" (UID: \"90afd016-0994-4d25-ae04-1609f9b811b0\") " pod="openstack/cinder-db-create-tgmpn" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.155984 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-79nnj-config-s8tn4"] Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.183393 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-2t8zc"] Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.188793 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2t8zc" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.207132 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2t8zc"] Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.251229 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2z7n\" (UniqueName: \"kubernetes.io/projected/d17e6a20-e807-4f0b-8a17-551f2c547ae5-kube-api-access-m2z7n\") pod \"neutron-db-create-2t8zc\" (UID: \"d17e6a20-e807-4f0b-8a17-551f2c547ae5\") " pod="openstack/neutron-db-create-2t8zc" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.251575 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxz24\" (UniqueName: \"kubernetes.io/projected/90afd016-0994-4d25-ae04-1609f9b811b0-kube-api-access-bxz24\") pod \"cinder-db-create-tgmpn\" (UID: \"90afd016-0994-4d25-ae04-1609f9b811b0\") " pod="openstack/cinder-db-create-tgmpn" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.278293 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxz24\" (UniqueName: \"kubernetes.io/projected/90afd016-0994-4d25-ae04-1609f9b811b0-kube-api-access-bxz24\") pod \"cinder-db-create-tgmpn\" (UID: \"90afd016-0994-4d25-ae04-1609f9b811b0\") " pod="openstack/cinder-db-create-tgmpn" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.352825 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2z7n\" (UniqueName: \"kubernetes.io/projected/d17e6a20-e807-4f0b-8a17-551f2c547ae5-kube-api-access-m2z7n\") pod \"neutron-db-create-2t8zc\" (UID: \"d17e6a20-e807-4f0b-8a17-551f2c547ae5\") " pod="openstack/neutron-db-create-2t8zc" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.361009 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tgmpn" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.369745 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2z7n\" (UniqueName: \"kubernetes.io/projected/d17e6a20-e807-4f0b-8a17-551f2c547ae5-kube-api-access-m2z7n\") pod \"neutron-db-create-2t8zc\" (UID: \"d17e6a20-e807-4f0b-8a17-551f2c547ae5\") " pod="openstack/neutron-db-create-2t8zc" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.445427 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-vf8bf"] Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.449210 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vf8bf" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.453033 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.455393 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.455435 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.455563 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zmxkr" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.460363 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vf8bf"] Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.541438 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2t8zc" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.557883 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l726m\" (UniqueName: \"kubernetes.io/projected/e327b2a5-293e-472e-a1e8-e25f01d54232-kube-api-access-l726m\") pod \"keystone-db-sync-vf8bf\" (UID: \"e327b2a5-293e-472e-a1e8-e25f01d54232\") " pod="openstack/keystone-db-sync-vf8bf" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.557948 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e327b2a5-293e-472e-a1e8-e25f01d54232-combined-ca-bundle\") pod \"keystone-db-sync-vf8bf\" (UID: \"e327b2a5-293e-472e-a1e8-e25f01d54232\") " pod="openstack/keystone-db-sync-vf8bf" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.558105 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e327b2a5-293e-472e-a1e8-e25f01d54232-config-data\") pod \"keystone-db-sync-vf8bf\" (UID: \"e327b2a5-293e-472e-a1e8-e25f01d54232\") " pod="openstack/keystone-db-sync-vf8bf" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.661518 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e327b2a5-293e-472e-a1e8-e25f01d54232-combined-ca-bundle\") pod \"keystone-db-sync-vf8bf\" (UID: \"e327b2a5-293e-472e-a1e8-e25f01d54232\") " pod="openstack/keystone-db-sync-vf8bf" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.661595 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e327b2a5-293e-472e-a1e8-e25f01d54232-config-data\") pod \"keystone-db-sync-vf8bf\" (UID: \"e327b2a5-293e-472e-a1e8-e25f01d54232\") " pod="openstack/keystone-db-sync-vf8bf" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.661684 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l726m\" (UniqueName: \"kubernetes.io/projected/e327b2a5-293e-472e-a1e8-e25f01d54232-kube-api-access-l726m\") pod \"keystone-db-sync-vf8bf\" (UID: \"e327b2a5-293e-472e-a1e8-e25f01d54232\") " pod="openstack/keystone-db-sync-vf8bf" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.675665 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e327b2a5-293e-472e-a1e8-e25f01d54232-config-data\") pod \"keystone-db-sync-vf8bf\" (UID: \"e327b2a5-293e-472e-a1e8-e25f01d54232\") " pod="openstack/keystone-db-sync-vf8bf" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.679245 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e327b2a5-293e-472e-a1e8-e25f01d54232-combined-ca-bundle\") pod \"keystone-db-sync-vf8bf\" (UID: \"e327b2a5-293e-472e-a1e8-e25f01d54232\") " pod="openstack/keystone-db-sync-vf8bf" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.687145 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l726m\" (UniqueName: \"kubernetes.io/projected/e327b2a5-293e-472e-a1e8-e25f01d54232-kube-api-access-l726m\") pod \"keystone-db-sync-vf8bf\" (UID: \"e327b2a5-293e-472e-a1e8-e25f01d54232\") " pod="openstack/keystone-db-sync-vf8bf" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.779101 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vf8bf" Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.907634 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-tgmpn"] Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.926665 4778 generic.go:334] "Generic (PLEG): container finished" podID="e71e3d75-1d61-4c1b-ad85-f7677a471718" containerID="b7b9816c46f54a3b0a286e5f7c0f0f356e4366bd434c026da3a48a1b37314b27" exitCode=0 Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.927133 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-79nnj-config-s8tn4" event={"ID":"e71e3d75-1d61-4c1b-ad85-f7677a471718","Type":"ContainerDied","Data":"b7b9816c46f54a3b0a286e5f7c0f0f356e4366bd434c026da3a48a1b37314b27"} Sep 30 17:33:45 crc kubenswrapper[4778]: I0930 17:33:45.927156 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-79nnj-config-s8tn4" event={"ID":"e71e3d75-1d61-4c1b-ad85-f7677a471718","Type":"ContainerStarted","Data":"8998a52c5442d45f2b1242ce59b3bf7afc5ad8432908f7d8e8b5a0a8a8459f42"} Sep 30 17:33:45 crc kubenswrapper[4778]: W0930 17:33:45.927697 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90afd016_0994_4d25_ae04_1609f9b811b0.slice/crio-e5387a46003ccc34fd21a3bfe41557599c270cd84ceb1c3564c6f661f73772f9 WatchSource:0}: Error finding container e5387a46003ccc34fd21a3bfe41557599c270cd84ceb1c3564c6f661f73772f9: Status 404 returned error can't find the container with id e5387a46003ccc34fd21a3bfe41557599c270cd84ceb1c3564c6f661f73772f9 Sep 30 17:33:46 crc kubenswrapper[4778]: I0930 17:33:46.035999 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2t8zc"] Sep 30 17:33:46 crc kubenswrapper[4778]: I0930 17:33:46.215159 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vf8bf"] Sep 30 17:33:46 crc kubenswrapper[4778]: W0930 17:33:46.321555 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode327b2a5_293e_472e_a1e8_e25f01d54232.slice/crio-2ff88b25225cbbfa3cf9fc619d23e5d5b05e0338afc026a07d86653ebfd24a50 WatchSource:0}: Error finding container 2ff88b25225cbbfa3cf9fc619d23e5d5b05e0338afc026a07d86653ebfd24a50: Status 404 returned error can't find the container with id 2ff88b25225cbbfa3cf9fc619d23e5d5b05e0338afc026a07d86653ebfd24a50 Sep 30 17:33:46 crc kubenswrapper[4778]: I0930 17:33:46.323519 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-dda9-account-create-dnfpr" Sep 30 17:33:46 crc kubenswrapper[4778]: I0930 17:33:46.373121 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq9rb\" (UniqueName: \"kubernetes.io/projected/7b9285b1-4e8e-4600-9dbb-1b0bf16a742d-kube-api-access-wq9rb\") pod \"7b9285b1-4e8e-4600-9dbb-1b0bf16a742d\" (UID: \"7b9285b1-4e8e-4600-9dbb-1b0bf16a742d\") " Sep 30 17:33:46 crc kubenswrapper[4778]: I0930 17:33:46.381220 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b9285b1-4e8e-4600-9dbb-1b0bf16a742d-kube-api-access-wq9rb" (OuterVolumeSpecName: "kube-api-access-wq9rb") pod "7b9285b1-4e8e-4600-9dbb-1b0bf16a742d" (UID: "7b9285b1-4e8e-4600-9dbb-1b0bf16a742d"). InnerVolumeSpecName "kube-api-access-wq9rb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:33:46 crc kubenswrapper[4778]: I0930 17:33:46.475190 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq9rb\" (UniqueName: \"kubernetes.io/projected/7b9285b1-4e8e-4600-9dbb-1b0bf16a742d-kube-api-access-wq9rb\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:46 crc kubenswrapper[4778]: I0930 17:33:46.939592 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-dda9-account-create-dnfpr" event={"ID":"7b9285b1-4e8e-4600-9dbb-1b0bf16a742d","Type":"ContainerDied","Data":"a179144dccf9ace371e2936a71b8fd21f7a5c4ed799bb730bf3a29413dce8500"} Sep 30 17:33:46 crc kubenswrapper[4778]: I0930 17:33:46.939983 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a179144dccf9ace371e2936a71b8fd21f7a5c4ed799bb730bf3a29413dce8500" Sep 30 17:33:46 crc kubenswrapper[4778]: I0930 17:33:46.939655 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-dda9-account-create-dnfpr" Sep 30 17:33:46 crc kubenswrapper[4778]: I0930 17:33:46.941255 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vf8bf" event={"ID":"e327b2a5-293e-472e-a1e8-e25f01d54232","Type":"ContainerStarted","Data":"2ff88b25225cbbfa3cf9fc619d23e5d5b05e0338afc026a07d86653ebfd24a50"} Sep 30 17:33:46 crc kubenswrapper[4778]: I0930 17:33:46.943093 4778 generic.go:334] "Generic (PLEG): container finished" podID="d17e6a20-e807-4f0b-8a17-551f2c547ae5" containerID="4fa95329fed631df5f5d87fd9e237641382877d7dee074994d1a3e36e010bde5" exitCode=0 Sep 30 17:33:46 crc kubenswrapper[4778]: I0930 17:33:46.943137 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2t8zc" event={"ID":"d17e6a20-e807-4f0b-8a17-551f2c547ae5","Type":"ContainerDied","Data":"4fa95329fed631df5f5d87fd9e237641382877d7dee074994d1a3e36e010bde5"} Sep 30 17:33:46 crc kubenswrapper[4778]: I0930 17:33:46.943154 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2t8zc" event={"ID":"d17e6a20-e807-4f0b-8a17-551f2c547ae5","Type":"ContainerStarted","Data":"0ff520ae55728773e564db2e36c7100423a160efaa1be06d0fa3ba31f9e2e571"} Sep 30 17:33:46 crc kubenswrapper[4778]: I0930 17:33:46.946956 4778 generic.go:334] "Generic (PLEG): container finished" podID="90afd016-0994-4d25-ae04-1609f9b811b0" containerID="6b8c982a5f71a8b2b431ce0aa74b74196e33f6b4908a7617b910d37c683f803a" exitCode=0 Sep 30 17:33:46 crc kubenswrapper[4778]: I0930 17:33:46.947007 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tgmpn" event={"ID":"90afd016-0994-4d25-ae04-1609f9b811b0","Type":"ContainerDied","Data":"6b8c982a5f71a8b2b431ce0aa74b74196e33f6b4908a7617b910d37c683f803a"} Sep 30 17:33:46 crc kubenswrapper[4778]: I0930 17:33:46.947054 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tgmpn" event={"ID":"90afd016-0994-4d25-ae04-1609f9b811b0","Type":"ContainerStarted","Data":"e5387a46003ccc34fd21a3bfe41557599c270cd84ceb1c3564c6f661f73772f9"} Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.301558 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.388021 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e71e3d75-1d61-4c1b-ad85-f7677a471718-scripts\") pod \"e71e3d75-1d61-4c1b-ad85-f7677a471718\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.388114 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e71e3d75-1d61-4c1b-ad85-f7677a471718-additional-scripts\") pod \"e71e3d75-1d61-4c1b-ad85-f7677a471718\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.388148 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-run\") pod \"e71e3d75-1d61-4c1b-ad85-f7677a471718\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.388183 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-log-ovn\") pod \"e71e3d75-1d61-4c1b-ad85-f7677a471718\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.388252 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-run-ovn\") pod \"e71e3d75-1d61-4c1b-ad85-f7677a471718\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.388302 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnbvq\" (UniqueName: \"kubernetes.io/projected/e71e3d75-1d61-4c1b-ad85-f7677a471718-kube-api-access-fnbvq\") pod \"e71e3d75-1d61-4c1b-ad85-f7677a471718\" (UID: \"e71e3d75-1d61-4c1b-ad85-f7677a471718\") " Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.389306 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-run" (OuterVolumeSpecName: "var-run") pod "e71e3d75-1d61-4c1b-ad85-f7677a471718" (UID: "e71e3d75-1d61-4c1b-ad85-f7677a471718"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.389604 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e71e3d75-1d61-4c1b-ad85-f7677a471718" (UID: "e71e3d75-1d61-4c1b-ad85-f7677a471718"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.389686 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e71e3d75-1d61-4c1b-ad85-f7677a471718" (UID: "e71e3d75-1d61-4c1b-ad85-f7677a471718"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.390043 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e71e3d75-1d61-4c1b-ad85-f7677a471718-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e71e3d75-1d61-4c1b-ad85-f7677a471718" (UID: "e71e3d75-1d61-4c1b-ad85-f7677a471718"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.390190 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e71e3d75-1d61-4c1b-ad85-f7677a471718-scripts" (OuterVolumeSpecName: "scripts") pod "e71e3d75-1d61-4c1b-ad85-f7677a471718" (UID: "e71e3d75-1d61-4c1b-ad85-f7677a471718"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.406741 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e71e3d75-1d61-4c1b-ad85-f7677a471718-kube-api-access-fnbvq" (OuterVolumeSpecName: "kube-api-access-fnbvq") pod "e71e3d75-1d61-4c1b-ad85-f7677a471718" (UID: "e71e3d75-1d61-4c1b-ad85-f7677a471718"). InnerVolumeSpecName "kube-api-access-fnbvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.490760 4778 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.490811 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnbvq\" (UniqueName: \"kubernetes.io/projected/e71e3d75-1d61-4c1b-ad85-f7677a471718-kube-api-access-fnbvq\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.490824 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e71e3d75-1d61-4c1b-ad85-f7677a471718-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.490832 4778 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e71e3d75-1d61-4c1b-ad85-f7677a471718-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.490842 4778 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.490850 4778 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e71e3d75-1d61-4c1b-ad85-f7677a471718-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.959284 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-79nnj-config-s8tn4" Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.959849 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-79nnj-config-s8tn4" event={"ID":"e71e3d75-1d61-4c1b-ad85-f7677a471718","Type":"ContainerDied","Data":"8998a52c5442d45f2b1242ce59b3bf7afc5ad8432908f7d8e8b5a0a8a8459f42"} Sep 30 17:33:47 crc kubenswrapper[4778]: I0930 17:33:47.959872 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8998a52c5442d45f2b1242ce59b3bf7afc5ad8432908f7d8e8b5a0a8a8459f42" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.397485 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-79nnj-config-s8tn4"] Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.411662 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-79nnj-config-s8tn4"] Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.505938 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-79nnj-config-npbf9"] Sep 30 17:33:48 crc kubenswrapper[4778]: E0930 17:33:48.506409 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b9285b1-4e8e-4600-9dbb-1b0bf16a742d" containerName="mariadb-account-create" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.506426 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b9285b1-4e8e-4600-9dbb-1b0bf16a742d" containerName="mariadb-account-create" Sep 30 17:33:48 crc kubenswrapper[4778]: E0930 17:33:48.506448 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e71e3d75-1d61-4c1b-ad85-f7677a471718" containerName="ovn-config" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.506455 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71e3d75-1d61-4c1b-ad85-f7677a471718" containerName="ovn-config" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.506673 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b9285b1-4e8e-4600-9dbb-1b0bf16a742d" containerName="mariadb-account-create" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.506698 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e71e3d75-1d61-4c1b-ad85-f7677a471718" containerName="ovn-config" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.507348 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.514653 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-79nnj-config-npbf9"] Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.521595 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.582763 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-pk9vz"] Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.583973 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pk9vz" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.590436 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.590440 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v8k48" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.603231 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pk9vz"] Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.613279 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-combined-ca-bundle\") pod \"glance-db-sync-pk9vz\" (UID: \"e0280205-5cc0-45e5-8f2d-926b300cb348\") " pod="openstack/glance-db-sync-pk9vz" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.613341 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-config-data\") pod \"glance-db-sync-pk9vz\" (UID: \"e0280205-5cc0-45e5-8f2d-926b300cb348\") " pod="openstack/glance-db-sync-pk9vz" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.613362 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-run\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.613689 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6655b6df-f160-4521-b749-dc2a42c8bbff-additional-scripts\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.613733 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-db-sync-config-data\") pod \"glance-db-sync-pk9vz\" (UID: \"e0280205-5cc0-45e5-8f2d-926b300cb348\") " pod="openstack/glance-db-sync-pk9vz" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.613808 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swgrm\" (UniqueName: \"kubernetes.io/projected/e0280205-5cc0-45e5-8f2d-926b300cb348-kube-api-access-swgrm\") pod \"glance-db-sync-pk9vz\" (UID: \"e0280205-5cc0-45e5-8f2d-926b300cb348\") " pod="openstack/glance-db-sync-pk9vz" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.613826 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6655b6df-f160-4521-b749-dc2a42c8bbff-scripts\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.613848 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-run-ovn\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.613876 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-log-ovn\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.613906 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v56z2\" (UniqueName: \"kubernetes.io/projected/6655b6df-f160-4521-b749-dc2a42c8bbff-kube-api-access-v56z2\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.715760 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swgrm\" (UniqueName: \"kubernetes.io/projected/e0280205-5cc0-45e5-8f2d-926b300cb348-kube-api-access-swgrm\") pod \"glance-db-sync-pk9vz\" (UID: \"e0280205-5cc0-45e5-8f2d-926b300cb348\") " pod="openstack/glance-db-sync-pk9vz" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.715810 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6655b6df-f160-4521-b749-dc2a42c8bbff-scripts\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.715841 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-run-ovn\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.715868 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-log-ovn\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.715901 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v56z2\" (UniqueName: \"kubernetes.io/projected/6655b6df-f160-4521-b749-dc2a42c8bbff-kube-api-access-v56z2\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.715935 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-combined-ca-bundle\") pod \"glance-db-sync-pk9vz\" (UID: \"e0280205-5cc0-45e5-8f2d-926b300cb348\") " pod="openstack/glance-db-sync-pk9vz" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.715982 4778 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-config-data\") pod \"glance-db-sync-pk9vz\" (UID: \"e0280205-5cc0-45e5-8f2d-926b300cb348\") " pod="openstack/glance-db-sync-pk9vz" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.715999 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-run\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.716037 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6655b6df-f160-4521-b749-dc2a42c8bbff-additional-scripts\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.716060 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-db-sync-config-data\") pod \"glance-db-sync-pk9vz\" (UID: \"e0280205-5cc0-45e5-8f2d-926b300cb348\") " pod="openstack/glance-db-sync-pk9vz" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.716252 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-log-ovn\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.716456 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-run-ovn\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.716606 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-run\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.718358 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6655b6df-f160-4521-b749-dc2a42c8bbff-additional-scripts\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.719094 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6655b6df-f160-4521-b749-dc2a42c8bbff-scripts\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.723388 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-config-data\") pod \"glance-db-sync-pk9vz\" (UID: \"e0280205-5cc0-45e5-8f2d-926b300cb348\") " pod="openstack/glance-db-sync-pk9vz" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.724383 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-combined-ca-bundle\") pod \"glance-db-sync-pk9vz\" (UID: \"e0280205-5cc0-45e5-8f2d-926b300cb348\") " pod="openstack/glance-db-sync-pk9vz" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.725385 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-db-sync-config-data\") pod \"glance-db-sync-pk9vz\" (UID: \"e0280205-5cc0-45e5-8f2d-926b300cb348\") " pod="openstack/glance-db-sync-pk9vz" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.742034 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v56z2\" (UniqueName: \"kubernetes.io/projected/6655b6df-f160-4521-b749-dc2a42c8bbff-kube-api-access-v56z2\") pod \"ovn-controller-79nnj-config-npbf9\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.745327 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swgrm\" (UniqueName: \"kubernetes.io/projected/e0280205-5cc0-45e5-8f2d-926b300cb348-kube-api-access-swgrm\") pod \"glance-db-sync-pk9vz\" (UID: \"e0280205-5cc0-45e5-8f2d-926b300cb348\") " pod="openstack/glance-db-sync-pk9vz" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.837082 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.914938 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pk9vz" Sep 30 17:33:48 crc kubenswrapper[4778]: I0930 17:33:48.949857 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-79nnj" Sep 30 17:33:49 crc kubenswrapper[4778]: I0930 17:33:49.724409 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e71e3d75-1d61-4c1b-ad85-f7677a471718" path="/var/lib/kubelet/pods/e71e3d75-1d61-4c1b-ad85-f7677a471718/volumes" Sep 30 17:33:51 crc kubenswrapper[4778]: I0930 17:33:51.007368 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2t8zc" event={"ID":"d17e6a20-e807-4f0b-8a17-551f2c547ae5","Type":"ContainerDied","Data":"0ff520ae55728773e564db2e36c7100423a160efaa1be06d0fa3ba31f9e2e571"} Sep 30 17:33:51 crc kubenswrapper[4778]: I0930 17:33:51.007802 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ff520ae55728773e564db2e36c7100423a160efaa1be06d0fa3ba31f9e2e571" Sep 30 17:33:51 crc kubenswrapper[4778]: I0930 17:33:51.012559 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-tgmpn" event={"ID":"90afd016-0994-4d25-ae04-1609f9b811b0","Type":"ContainerDied","Data":"e5387a46003ccc34fd21a3bfe41557599c270cd84ceb1c3564c6f661f73772f9"} Sep 30 17:33:51 crc kubenswrapper[4778]: I0930 17:33:51.012852 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5387a46003ccc34fd21a3bfe41557599c270cd84ceb1c3564c6f661f73772f9" Sep 30 17:33:51 crc kubenswrapper[4778]: I0930 17:33:51.014734 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-tgmpn" Sep 30 17:33:51 crc kubenswrapper[4778]: I0930 17:33:51.027937 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2t8zc" Sep 30 17:33:51 crc kubenswrapper[4778]: I0930 17:33:51.168602 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxz24\" (UniqueName: \"kubernetes.io/projected/90afd016-0994-4d25-ae04-1609f9b811b0-kube-api-access-bxz24\") pod \"90afd016-0994-4d25-ae04-1609f9b811b0\" (UID: \"90afd016-0994-4d25-ae04-1609f9b811b0\") " Sep 30 17:33:51 crc kubenswrapper[4778]: I0930 17:33:51.168978 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2z7n\" (UniqueName: \"kubernetes.io/projected/d17e6a20-e807-4f0b-8a17-551f2c547ae5-kube-api-access-m2z7n\") pod \"d17e6a20-e807-4f0b-8a17-551f2c547ae5\" (UID: \"d17e6a20-e807-4f0b-8a17-551f2c547ae5\") " Sep 30 17:33:51 crc kubenswrapper[4778]: I0930 17:33:51.176434 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90afd016-0994-4d25-ae04-1609f9b811b0-kube-api-access-bxz24" (OuterVolumeSpecName: "kube-api-access-bxz24") pod "90afd016-0994-4d25-ae04-1609f9b811b0" (UID: "90afd016-0994-4d25-ae04-1609f9b811b0"). InnerVolumeSpecName "kube-api-access-bxz24". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:33:51 crc kubenswrapper[4778]: I0930 17:33:51.178571 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d17e6a20-e807-4f0b-8a17-551f2c547ae5-kube-api-access-m2z7n" (OuterVolumeSpecName: "kube-api-access-m2z7n") pod "d17e6a20-e807-4f0b-8a17-551f2c547ae5" (UID: "d17e6a20-e807-4f0b-8a17-551f2c547ae5"). InnerVolumeSpecName "kube-api-access-m2z7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:33:51 crc kubenswrapper[4778]: I0930 17:33:51.271283 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2z7n\" (UniqueName: \"kubernetes.io/projected/d17e6a20-e807-4f0b-8a17-551f2c547ae5-kube-api-access-m2z7n\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:51 crc kubenswrapper[4778]: I0930 17:33:51.271333 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxz24\" (UniqueName: \"kubernetes.io/projected/90afd016-0994-4d25-ae04-1609f9b811b0-kube-api-access-bxz24\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:51 crc kubenswrapper[4778]: I0930 17:33:51.377273 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-79nnj-config-npbf9"] Sep 30 17:33:51 crc kubenswrapper[4778]: I0930 17:33:51.482091 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pk9vz"] Sep 30 17:33:51 crc kubenswrapper[4778]: W0930 17:33:51.485387 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0280205_5cc0_45e5_8f2d_926b300cb348.slice/crio-7a365050019582519fbdcc016e44af1c1bdc1e86431f3e6101cf0c3c5d0be8fc WatchSource:0}: Error finding container 7a365050019582519fbdcc016e44af1c1bdc1e86431f3e6101cf0c3c5d0be8fc: Status 404 returned error can't find the container with id 7a365050019582519fbdcc016e44af1c1bdc1e86431f3e6101cf0c3c5d0be8fc Sep 30 17:33:52 crc kubenswrapper[4778]: I0930 17:33:52.023140 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vf8bf" event={"ID":"e327b2a5-293e-472e-a1e8-e25f01d54232","Type":"ContainerStarted","Data":"c4f55799c611a9e25ba2d7b495b5641106b4dcdc0d1b67e66ec29cf003ce0155"} Sep 30 17:33:52 crc kubenswrapper[4778]: I0930 17:33:52.025509 4778 generic.go:334] "Generic (PLEG): container finished" podID="6655b6df-f160-4521-b749-dc2a42c8bbff" containerID="fc1237327068b1110e5968051edbe956cc9e3480ea6c35718ae2d5ebf2c0a89d" exitCode=0 Sep 30 17:33:52 crc kubenswrapper[4778]: I0930 17:33:52.025564 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-79nnj-config-npbf9" event={"ID":"6655b6df-f160-4521-b749-dc2a42c8bbff","Type":"ContainerDied","Data":"fc1237327068b1110e5968051edbe956cc9e3480ea6c35718ae2d5ebf2c0a89d"} Sep 30 17:33:52 crc kubenswrapper[4778]: I0930 17:33:52.025581 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-79nnj-config-npbf9" event={"ID":"6655b6df-f160-4521-b749-dc2a42c8bbff","Type":"ContainerStarted","Data":"b2fcdbd053d7ef8bd52e241e7754ca2382fa762de8a99db553f6b5a27ad3fdb4"} Sep 30 17:33:52 crc kubenswrapper[4778]: I0930 17:33:52.027998 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2t8zc" Sep 30 17:33:52 crc kubenswrapper[4778]: I0930 17:33:52.028062 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pk9vz" event={"ID":"e0280205-5cc0-45e5-8f2d-926b300cb348","Type":"ContainerStarted","Data":"7a365050019582519fbdcc016e44af1c1bdc1e86431f3e6101cf0c3c5d0be8fc"} Sep 30 17:33:52 crc kubenswrapper[4778]: I0930 17:33:52.028218 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-tgmpn" Sep 30 17:33:52 crc kubenswrapper[4778]: I0930 17:33:52.048201 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-vf8bf" podStartSLOduration=2.481546201 podStartE2EDuration="7.048182252s" podCreationTimestamp="2025-09-30 17:33:45 +0000 UTC" firstStartedPulling="2025-09-30 17:33:46.326663218 +0000 UTC m=+965.316561021" lastFinishedPulling="2025-09-30 17:33:50.893299269 +0000 UTC m=+969.883197072" observedRunningTime="2025-09-30 17:33:52.043661994 +0000 UTC m=+971.033559887" watchObservedRunningTime="2025-09-30 17:33:52.048182252 +0000 UTC m=+971.038080055" Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.349438 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.413446 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-run-ovn\") pod \"6655b6df-f160-4521-b749-dc2a42c8bbff\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.413548 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6655b6df-f160-4521-b749-dc2a42c8bbff-additional-scripts\") pod \"6655b6df-f160-4521-b749-dc2a42c8bbff\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.413672 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6655b6df-f160-4521-b749-dc2a42c8bbff-scripts\") pod \"6655b6df-f160-4521-b749-dc2a42c8bbff\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.413699 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "6655b6df-f160-4521-b749-dc2a42c8bbff" (UID: "6655b6df-f160-4521-b749-dc2a42c8bbff"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.413716 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-log-ovn\") pod \"6655b6df-f160-4521-b749-dc2a42c8bbff\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.413761 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "6655b6df-f160-4521-b749-dc2a42c8bbff" (UID: "6655b6df-f160-4521-b749-dc2a42c8bbff"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.413858 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v56z2\" (UniqueName: \"kubernetes.io/projected/6655b6df-f160-4521-b749-dc2a42c8bbff-kube-api-access-v56z2\") pod \"6655b6df-f160-4521-b749-dc2a42c8bbff\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.414005 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-run\") pod \"6655b6df-f160-4521-b749-dc2a42c8bbff\" (UID: \"6655b6df-f160-4521-b749-dc2a42c8bbff\") " Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.414100 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-run" (OuterVolumeSpecName: "var-run") pod "6655b6df-f160-4521-b749-dc2a42c8bbff" (UID: "6655b6df-f160-4521-b749-dc2a42c8bbff"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.414521 4778 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.414537 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6655b6df-f160-4521-b749-dc2a42c8bbff-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "6655b6df-f160-4521-b749-dc2a42c8bbff" (UID: "6655b6df-f160-4521-b749-dc2a42c8bbff"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.414545 4778 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.414594 4778 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6655b6df-f160-4521-b749-dc2a42c8bbff-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.414722 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6655b6df-f160-4521-b749-dc2a42c8bbff-scripts" (OuterVolumeSpecName: "scripts") pod "6655b6df-f160-4521-b749-dc2a42c8bbff" (UID: "6655b6df-f160-4521-b749-dc2a42c8bbff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.419748 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6655b6df-f160-4521-b749-dc2a42c8bbff-kube-api-access-v56z2" (OuterVolumeSpecName: "kube-api-access-v56z2") pod "6655b6df-f160-4521-b749-dc2a42c8bbff" (UID: "6655b6df-f160-4521-b749-dc2a42c8bbff"). InnerVolumeSpecName "kube-api-access-v56z2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.516661 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6655b6df-f160-4521-b749-dc2a42c8bbff-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.516701 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v56z2\" (UniqueName: \"kubernetes.io/projected/6655b6df-f160-4521-b749-dc2a42c8bbff-kube-api-access-v56z2\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:53 crc kubenswrapper[4778]: I0930 17:33:53.516716 4778 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/6655b6df-f160-4521-b749-dc2a42c8bbff-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:54 crc kubenswrapper[4778]: I0930 17:33:54.046420 4778 generic.go:334] "Generic (PLEG): container finished" podID="e327b2a5-293e-472e-a1e8-e25f01d54232" containerID="c4f55799c611a9e25ba2d7b495b5641106b4dcdc0d1b67e66ec29cf003ce0155" exitCode=0 Sep 30 17:33:54 crc kubenswrapper[4778]: I0930 17:33:54.046491 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vf8bf" event={"ID":"e327b2a5-293e-472e-a1e8-e25f01d54232","Type":"ContainerDied","Data":"c4f55799c611a9e25ba2d7b495b5641106b4dcdc0d1b67e66ec29cf003ce0155"} Sep 30 17:33:54 crc kubenswrapper[4778]: I0930 17:33:54.048278 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-79nnj-config-npbf9" event={"ID":"6655b6df-f160-4521-b749-dc2a42c8bbff","Type":"ContainerDied","Data":"b2fcdbd053d7ef8bd52e241e7754ca2382fa762de8a99db553f6b5a27ad3fdb4"} Sep 30 17:33:54 crc kubenswrapper[4778]: I0930 17:33:54.048399 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2fcdbd053d7ef8bd52e241e7754ca2382fa762de8a99db553f6b5a27ad3fdb4" Sep 30 17:33:54 crc kubenswrapper[4778]: I0930 17:33:54.048462 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-79nnj-config-npbf9" Sep 30 17:33:54 crc kubenswrapper[4778]: I0930 17:33:54.419569 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-79nnj-config-npbf9"] Sep 30 17:33:54 crc kubenswrapper[4778]: I0930 17:33:54.426145 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-79nnj-config-npbf9"] Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.142127 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8860-account-create-jp7dv"] Sep 30 17:33:55 crc kubenswrapper[4778]: E0930 17:33:55.142830 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d17e6a20-e807-4f0b-8a17-551f2c547ae5" containerName="mariadb-database-create" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.142844 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17e6a20-e807-4f0b-8a17-551f2c547ae5" containerName="mariadb-database-create" Sep 30 17:33:55 crc kubenswrapper[4778]: E0930 17:33:55.142864 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90afd016-0994-4d25-ae04-1609f9b811b0" containerName="mariadb-database-create" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.142869 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="90afd016-0994-4d25-ae04-1609f9b811b0" containerName="mariadb-database-create" Sep 30 17:33:55 crc kubenswrapper[4778]: E0930 17:33:55.142884 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6655b6df-f160-4521-b749-dc2a42c8bbff" containerName="ovn-config" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.142890 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6655b6df-f160-4521-b749-dc2a42c8bbff" containerName="ovn-config" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.143041 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d17e6a20-e807-4f0b-8a17-551f2c547ae5" containerName="mariadb-database-create" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.143062 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6655b6df-f160-4521-b749-dc2a42c8bbff" containerName="ovn-config" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.143072 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="90afd016-0994-4d25-ae04-1609f9b811b0" containerName="mariadb-database-create" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.143824 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8860-account-create-jp7dv" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.147234 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.157190 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8860-account-create-jp7dv"] Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.242408 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpp5q\" (UniqueName: \"kubernetes.io/projected/615fbb2f-3311-4f19-9ffa-6c01b802ae13-kube-api-access-dpp5q\") pod \"cinder-8860-account-create-jp7dv\" (UID: \"615fbb2f-3311-4f19-9ffa-6c01b802ae13\") " pod="openstack/cinder-8860-account-create-jp7dv" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.344676 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpp5q\" (UniqueName: \"kubernetes.io/projected/615fbb2f-3311-4f19-9ffa-6c01b802ae13-kube-api-access-dpp5q\") pod \"cinder-8860-account-create-jp7dv\" (UID: \"615fbb2f-3311-4f19-9ffa-6c01b802ae13\") " pod="openstack/cinder-8860-account-create-jp7dv" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.359329 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vf8bf" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.365464 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpp5q\" (UniqueName: \"kubernetes.io/projected/615fbb2f-3311-4f19-9ffa-6c01b802ae13-kube-api-access-dpp5q\") pod \"cinder-8860-account-create-jp7dv\" (UID: \"615fbb2f-3311-4f19-9ffa-6c01b802ae13\") " pod="openstack/cinder-8860-account-create-jp7dv" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.445565 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e327b2a5-293e-472e-a1e8-e25f01d54232-config-data\") pod \"e327b2a5-293e-472e-a1e8-e25f01d54232\" (UID: \"e327b2a5-293e-472e-a1e8-e25f01d54232\") " Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.445744 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l726m\" (UniqueName: \"kubernetes.io/projected/e327b2a5-293e-472e-a1e8-e25f01d54232-kube-api-access-l726m\") pod \"e327b2a5-293e-472e-a1e8-e25f01d54232\" (UID: \"e327b2a5-293e-472e-a1e8-e25f01d54232\") " Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.445773 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e327b2a5-293e-472e-a1e8-e25f01d54232-combined-ca-bundle\") pod \"e327b2a5-293e-472e-a1e8-e25f01d54232\" (UID: \"e327b2a5-293e-472e-a1e8-e25f01d54232\") " Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.450079 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e327b2a5-293e-472e-a1e8-e25f01d54232-kube-api-access-l726m" (OuterVolumeSpecName: "kube-api-access-l726m") pod "e327b2a5-293e-472e-a1e8-e25f01d54232" (UID: "e327b2a5-293e-472e-a1e8-e25f01d54232"). InnerVolumeSpecName "kube-api-access-l726m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.466333 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8860-account-create-jp7dv" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.474764 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e327b2a5-293e-472e-a1e8-e25f01d54232-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e327b2a5-293e-472e-a1e8-e25f01d54232" (UID: "e327b2a5-293e-472e-a1e8-e25f01d54232"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.488944 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e327b2a5-293e-472e-a1e8-e25f01d54232-config-data" (OuterVolumeSpecName: "config-data") pod "e327b2a5-293e-472e-a1e8-e25f01d54232" (UID: "e327b2a5-293e-472e-a1e8-e25f01d54232"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.547159 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l726m\" (UniqueName: \"kubernetes.io/projected/e327b2a5-293e-472e-a1e8-e25f01d54232-kube-api-access-l726m\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.547191 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e327b2a5-293e-472e-a1e8-e25f01d54232-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.547204 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e327b2a5-293e-472e-a1e8-e25f01d54232-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.725799 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6655b6df-f160-4521-b749-dc2a42c8bbff" path="/var/lib/kubelet/pods/6655b6df-f160-4521-b749-dc2a42c8bbff/volumes" Sep 30 17:33:55 crc kubenswrapper[4778]: I0930 17:33:55.880660 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8860-account-create-jp7dv"] Sep 30 17:33:55 crc kubenswrapper[4778]: W0930 17:33:55.887567 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod615fbb2f_3311_4f19_9ffa_6c01b802ae13.slice/crio-b0697fa59f8ae67cc8b77d89a32fb0364139f088c595f69d467cfcfcfc500f70 WatchSource:0}: Error finding container b0697fa59f8ae67cc8b77d89a32fb0364139f088c595f69d467cfcfcfc500f70: Status 404 returned error can't find the container with id b0697fa59f8ae67cc8b77d89a32fb0364139f088c595f69d467cfcfcfc500f70 Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.070256 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vf8bf" event={"ID":"e327b2a5-293e-472e-a1e8-e25f01d54232","Type":"ContainerDied","Data":"2ff88b25225cbbfa3cf9fc619d23e5d5b05e0338afc026a07d86653ebfd24a50"} Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.070304 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ff88b25225cbbfa3cf9fc619d23e5d5b05e0338afc026a07d86653ebfd24a50" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.070348 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vf8bf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.087447 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8860-account-create-jp7dv" event={"ID":"615fbb2f-3311-4f19-9ffa-6c01b802ae13","Type":"ContainerStarted","Data":"6389067a01abab59ea73d0ec759ad17247ce56cb7707498b0a78930b1e18f000"} Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.087505 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8860-account-create-jp7dv" event={"ID":"615fbb2f-3311-4f19-9ffa-6c01b802ae13","Type":"ContainerStarted","Data":"b0697fa59f8ae67cc8b77d89a32fb0364139f088c595f69d467cfcfcfc500f70"} Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.102520 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-8860-account-create-jp7dv" podStartSLOduration=1.102502701 podStartE2EDuration="1.102502701s" podCreationTimestamp="2025-09-30 17:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:33:56.101442738 +0000 UTC m=+975.091340571" watchObservedRunningTime="2025-09-30 17:33:56.102502701 +0000 UTC m=+975.092400504" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.343341 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-tgccd"] Sep 30 17:33:56 crc kubenswrapper[4778]: E0930 17:33:56.344451 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e327b2a5-293e-472e-a1e8-e25f01d54232" containerName="keystone-db-sync" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.344469 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e327b2a5-293e-472e-a1e8-e25f01d54232" containerName="keystone-db-sync" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.344654 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e327b2a5-293e-472e-a1e8-e25f01d54232" containerName="keystone-db-sync" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.345152 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.347439 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.347682 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zmxkr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.347863 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.347970 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.356890 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-hpqcf"] Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.358294 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.365397 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-credential-keys\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.365437 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-hpqcf\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") " pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.365457 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-config\") pod \"dnsmasq-dns-75bb4695fc-hpqcf\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") " pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.365489 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-combined-ca-bundle\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.365645 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-config-data\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.365746 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-scripts\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.365798 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-hpqcf\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") " pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.365856 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzprz\" (UniqueName: \"kubernetes.io/projected/561dc04f-4607-420e-bb56-2f8bd70d477e-kube-api-access-rzprz\") pod \"dnsmasq-dns-75bb4695fc-hpqcf\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") " pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.365921 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-ovsdbserver-nb\") pod 
\"dnsmasq-dns-75bb4695fc-hpqcf\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") " pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.365963 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp27n\" (UniqueName: \"kubernetes.io/projected/adeaa955-499d-477f-baf2-b68c6fa6c4d1-kube-api-access-qp27n\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.366046 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-fernet-keys\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.375110 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tgccd"] Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.391271 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-hpqcf"] Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.454907 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67bfb67f5f-wzjct"] Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.457050 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.462515 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.462739 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.462410 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-sv57s" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.463339 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.463922 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67bfb67f5f-wzjct"] Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.467760 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-combined-ca-bundle\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.467823 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-config-data\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.467863 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/65f393a8-cd69-4aff-b73f-96416be3f843-horizon-secret-key\") pod \"horizon-67bfb67f5f-wzjct\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " 
pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.467891 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-scripts\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.467921 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65f393a8-cd69-4aff-b73f-96416be3f843-logs\") pod \"horizon-67bfb67f5f-wzjct\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.467939 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-hpqcf\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") " pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.467969 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzprz\" (UniqueName: \"kubernetes.io/projected/561dc04f-4607-420e-bb56-2f8bd70d477e-kube-api-access-rzprz\") pod \"dnsmasq-dns-75bb4695fc-hpqcf\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") " pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.468003 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5kzh\" (UniqueName: \"kubernetes.io/projected/65f393a8-cd69-4aff-b73f-96416be3f843-kube-api-access-d5kzh\") pod \"horizon-67bfb67f5f-wzjct\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.468023 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-hpqcf\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") " pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.468050 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp27n\" (UniqueName: \"kubernetes.io/projected/adeaa955-499d-477f-baf2-b68c6fa6c4d1-kube-api-access-qp27n\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.468070 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65f393a8-cd69-4aff-b73f-96416be3f843-config-data\") pod \"horizon-67bfb67f5f-wzjct\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.468089 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65f393a8-cd69-4aff-b73f-96416be3f843-scripts\") pod \"horizon-67bfb67f5f-wzjct\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 
17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.468119 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-fernet-keys\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.468140 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-credential-keys\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.468160 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-hpqcf\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") " pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.468178 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-config\") pod \"dnsmasq-dns-75bb4695fc-hpqcf\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") " pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.469282 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-config\") pod \"dnsmasq-dns-75bb4695fc-hpqcf\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") " pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.471106 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb4695fc-hpqcf\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") " pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.472532 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb4695fc-hpqcf\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") " pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.472860 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-dns-svc\") pod \"dnsmasq-dns-75bb4695fc-hpqcf\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") " pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.475885 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-scripts\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.478875 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-combined-ca-bundle\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.481855 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-credential-keys\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.482804 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-config-data\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.495349 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-fernet-keys\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.509556 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzprz\" (UniqueName: \"kubernetes.io/projected/561dc04f-4607-420e-bb56-2f8bd70d477e-kube-api-access-rzprz\") pod \"dnsmasq-dns-75bb4695fc-hpqcf\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") " pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.515584 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp27n\" (UniqueName: \"kubernetes.io/projected/adeaa955-499d-477f-baf2-b68c6fa6c4d1-kube-api-access-qp27n\") pod \"keystone-bootstrap-tgccd\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") " pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.569929 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/65f393a8-cd69-4aff-b73f-96416be3f843-horizon-secret-key\") pod \"horizon-67bfb67f5f-wzjct\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.570009 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65f393a8-cd69-4aff-b73f-96416be3f843-logs\") pod \"horizon-67bfb67f5f-wzjct\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.570052 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5kzh\" (UniqueName: \"kubernetes.io/projected/65f393a8-cd69-4aff-b73f-96416be3f843-kube-api-access-d5kzh\") pod \"horizon-67bfb67f5f-wzjct\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.570085 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65f393a8-cd69-4aff-b73f-96416be3f843-config-data\") pod \"horizon-67bfb67f5f-wzjct\" (UID: 
\"65f393a8-cd69-4aff-b73f-96416be3f843\") " pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.570108 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65f393a8-cd69-4aff-b73f-96416be3f843-scripts\") pod \"horizon-67bfb67f5f-wzjct\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.571204 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65f393a8-cd69-4aff-b73f-96416be3f843-scripts\") pod \"horizon-67bfb67f5f-wzjct\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.572789 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65f393a8-cd69-4aff-b73f-96416be3f843-logs\") pod \"horizon-67bfb67f5f-wzjct\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.573728 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65f393a8-cd69-4aff-b73f-96416be3f843-config-data\") pod \"horizon-67bfb67f5f-wzjct\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.576935 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/65f393a8-cd69-4aff-b73f-96416be3f843-horizon-secret-key\") pod \"horizon-67bfb67f5f-wzjct\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.636587 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5kzh\" (UniqueName: \"kubernetes.io/projected/65f393a8-cd69-4aff-b73f-96416be3f843-kube-api-access-d5kzh\") pod \"horizon-67bfb67f5f-wzjct\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.657515 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7888848bff-9rdhr"] Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.661160 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.671427 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-scripts\") pod \"horizon-7888848bff-9rdhr\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.671499 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-config-data\") pod \"horizon-7888848bff-9rdhr\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.671534 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-horizon-secret-key\") pod \"horizon-7888848bff-9rdhr\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.671551 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-logs\") pod \"horizon-7888848bff-9rdhr\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.671671 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgtmm\" (UniqueName: \"kubernetes.io/projected/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-kube-api-access-bgtmm\") pod \"horizon-7888848bff-9rdhr\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.671431 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-hpqcf"] Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.672406 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.678487 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tgccd" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.684819 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7888848bff-9rdhr"] Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.712019 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-fmzbx"] Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.718195 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fmzbx" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.726404 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.726630 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.727309 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6c4tb" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.760739 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fmzbx"] Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.772479 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-horizon-secret-key\") pod \"horizon-7888848bff-9rdhr\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.772602 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfrxz\" (UniqueName: \"kubernetes.io/projected/d593f115-5fd5-4db3-9244-8c90696317c4-kube-api-access-lfrxz\") pod \"placement-db-sync-fmzbx\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " pod="openstack/placement-db-sync-fmzbx" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.772706 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-logs\") pod \"horizon-7888848bff-9rdhr\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.772816 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-combined-ca-bundle\") pod \"placement-db-sync-fmzbx\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " pod="openstack/placement-db-sync-fmzbx" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.772938 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d593f115-5fd5-4db3-9244-8c90696317c4-logs\") pod \"placement-db-sync-fmzbx\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " pod="openstack/placement-db-sync-fmzbx" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.773124 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-config-data\") pod \"placement-db-sync-fmzbx\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " pod="openstack/placement-db-sync-fmzbx" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.773232 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgtmm\" (UniqueName: \"kubernetes.io/projected/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-kube-api-access-bgtmm\") pod \"horizon-7888848bff-9rdhr\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.773330 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-scripts\") pod \"placement-db-sync-fmzbx\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " pod="openstack/placement-db-sync-fmzbx" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.773433 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-scripts\") pod \"horizon-7888848bff-9rdhr\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.773536 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-config-data\") pod \"horizon-7888848bff-9rdhr\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.775003 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-logs\") pod \"horizon-7888848bff-9rdhr\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.775644 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-scripts\") pod \"horizon-7888848bff-9rdhr\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.776877 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xmpfk"] Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.778904 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.780427 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-horizon-secret-key\") pod \"horizon-7888848bff-9rdhr\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.783096 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xmpfk"] Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.783711 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-config-data\") pod \"horizon-7888848bff-9rdhr\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.814319 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgtmm\" (UniqueName: \"kubernetes.io/projected/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-kube-api-access-bgtmm\") pod \"horizon-7888848bff-9rdhr\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.877443 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfrxz\" (UniqueName: \"kubernetes.io/projected/d593f115-5fd5-4db3-9244-8c90696317c4-kube-api-access-lfrxz\") pod \"placement-db-sync-fmzbx\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " pod="openstack/placement-db-sync-fmzbx" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.877756 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-combined-ca-bundle\") pod \"placement-db-sync-fmzbx\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " pod="openstack/placement-db-sync-fmzbx" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.877860 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d593f115-5fd5-4db3-9244-8c90696317c4-logs\") pod \"placement-db-sync-fmzbx\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " pod="openstack/placement-db-sync-fmzbx" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.877943 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-config-data\") pod \"placement-db-sync-fmzbx\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " pod="openstack/placement-db-sync-fmzbx" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.878056 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-scripts\") pod \"placement-db-sync-fmzbx\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " pod="openstack/placement-db-sync-fmzbx" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.884094 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.884871 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d593f115-5fd5-4db3-9244-8c90696317c4-logs\") pod \"placement-db-sync-fmzbx\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " pod="openstack/placement-db-sync-fmzbx" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.891098 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-config-data\") pod \"placement-db-sync-fmzbx\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " pod="openstack/placement-db-sync-fmzbx" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.895987 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-scripts\") pod \"placement-db-sync-fmzbx\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " pod="openstack/placement-db-sync-fmzbx" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.897686 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-combined-ca-bundle\") pod \"placement-db-sync-fmzbx\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " pod="openstack/placement-db-sync-fmzbx" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.906669 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfrxz\" (UniqueName: \"kubernetes.io/projected/d593f115-5fd5-4db3-9244-8c90696317c4-kube-api-access-lfrxz\") pod \"placement-db-sync-fmzbx\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " pod="openstack/placement-db-sync-fmzbx" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.977459 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.980487 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-ovsdbserver-sb\") pod \"dnsmasq-dns-745b9ddc8c-xmpfk\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.980699 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-ovsdbserver-nb\") pod \"dnsmasq-dns-745b9ddc8c-xmpfk\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.980777 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5vkn\" (UniqueName: \"kubernetes.io/projected/353a1d8f-2f85-430e-a2f5-adb617373c35-kube-api-access-k5vkn\") pod \"dnsmasq-dns-745b9ddc8c-xmpfk\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.980831 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-xmpfk\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:33:56 crc kubenswrapper[4778]: I0930 17:33:56.980907 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-config\") pod \"dnsmasq-dns-745b9ddc8c-xmpfk\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:33:57 crc kubenswrapper[4778]: I0930 17:33:57.056409 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fmzbx" Sep 30 17:33:57 crc kubenswrapper[4778]: I0930 17:33:57.082872 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-config\") pod \"dnsmasq-dns-745b9ddc8c-xmpfk\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:33:57 crc kubenswrapper[4778]: I0930 17:33:57.083959 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-ovsdbserver-sb\") pod \"dnsmasq-dns-745b9ddc8c-xmpfk\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:33:57 crc kubenswrapper[4778]: I0930 17:33:57.084025 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-ovsdbserver-nb\") pod \"dnsmasq-dns-745b9ddc8c-xmpfk\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:33:57 crc kubenswrapper[4778]: I0930 17:33:57.084070 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5vkn\" (UniqueName: \"kubernetes.io/projected/353a1d8f-2f85-430e-a2f5-adb617373c35-kube-api-access-k5vkn\") pod \"dnsmasq-dns-745b9ddc8c-xmpfk\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:33:57 crc kubenswrapper[4778]: I0930 17:33:57.084105 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-xmpfk\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:33:57 crc kubenswrapper[4778]: I0930 17:33:57.084734 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-dns-svc\") pod \"dnsmasq-dns-745b9ddc8c-xmpfk\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:33:57 crc kubenswrapper[4778]: I0930 17:33:57.083891 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-config\") pod \"dnsmasq-dns-745b9ddc8c-xmpfk\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:33:57 crc kubenswrapper[4778]: I0930 17:33:57.085319 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-ovsdbserver-sb\") pod \"dnsmasq-dns-745b9ddc8c-xmpfk\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:33:57 crc kubenswrapper[4778]: I0930 17:33:57.086432 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-ovsdbserver-nb\") pod \"dnsmasq-dns-745b9ddc8c-xmpfk\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:33:57 crc kubenswrapper[4778]: I0930 17:33:57.103191 
4778 generic.go:334] "Generic (PLEG): container finished" podID="615fbb2f-3311-4f19-9ffa-6c01b802ae13" containerID="6389067a01abab59ea73d0ec759ad17247ce56cb7707498b0a78930b1e18f000" exitCode=0 Sep 30 17:33:57 crc kubenswrapper[4778]: I0930 17:33:57.103229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8860-account-create-jp7dv" event={"ID":"615fbb2f-3311-4f19-9ffa-6c01b802ae13","Type":"ContainerDied","Data":"6389067a01abab59ea73d0ec759ad17247ce56cb7707498b0a78930b1e18f000"} Sep 30 17:33:57 crc kubenswrapper[4778]: I0930 17:33:57.103757 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5vkn\" (UniqueName: \"kubernetes.io/projected/353a1d8f-2f85-430e-a2f5-adb617373c35-kube-api-access-k5vkn\") pod \"dnsmasq-dns-745b9ddc8c-xmpfk\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:33:57 crc kubenswrapper[4778]: I0930 17:33:57.399731 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.606572 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67bfb67f5f-wzjct"] Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.651823 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5cdc899d97-c5xnr"] Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.653348 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.665609 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5cdc899d97-c5xnr"] Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.713470 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e093cbb-cd23-4a15-8159-fa514c992d60-logs\") pod \"horizon-5cdc899d97-c5xnr\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.713516 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e093cbb-cd23-4a15-8159-fa514c992d60-scripts\") pod \"horizon-5cdc899d97-c5xnr\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.713544 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr5dh\" (UniqueName: \"kubernetes.io/projected/3e093cbb-cd23-4a15-8159-fa514c992d60-kube-api-access-lr5dh\") pod \"horizon-5cdc899d97-c5xnr\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.714009 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e093cbb-cd23-4a15-8159-fa514c992d60-config-data\") pod \"horizon-5cdc899d97-c5xnr\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.714062 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/3e093cbb-cd23-4a15-8159-fa514c992d60-horizon-secret-key\") pod \"horizon-5cdc899d97-c5xnr\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.816004 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e093cbb-cd23-4a15-8159-fa514c992d60-config-data\") pod \"horizon-5cdc899d97-c5xnr\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.816049 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e093cbb-cd23-4a15-8159-fa514c992d60-horizon-secret-key\") pod \"horizon-5cdc899d97-c5xnr\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.816094 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e093cbb-cd23-4a15-8159-fa514c992d60-logs\") pod \"horizon-5cdc899d97-c5xnr\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.816111 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e093cbb-cd23-4a15-8159-fa514c992d60-scripts\") pod \"horizon-5cdc899d97-c5xnr\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.816137 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lr5dh\" (UniqueName: \"kubernetes.io/projected/3e093cbb-cd23-4a15-8159-fa514c992d60-kube-api-access-lr5dh\") pod \"horizon-5cdc899d97-c5xnr\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.818454 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e093cbb-cd23-4a15-8159-fa514c992d60-logs\") pod \"horizon-5cdc899d97-c5xnr\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.819340 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e093cbb-cd23-4a15-8159-fa514c992d60-scripts\") pod \"horizon-5cdc899d97-c5xnr\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.820190 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e093cbb-cd23-4a15-8159-fa514c992d60-config-data\") pod \"horizon-5cdc899d97-c5xnr\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.824383 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e093cbb-cd23-4a15-8159-fa514c992d60-horizon-secret-key\") pod \"horizon-5cdc899d97-c5xnr\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 
17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.832970 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lr5dh\" (UniqueName: \"kubernetes.io/projected/3e093cbb-cd23-4a15-8159-fa514c992d60-kube-api-access-lr5dh\") pod \"horizon-5cdc899d97-c5xnr\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:33:58 crc kubenswrapper[4778]: I0930 17:33:58.973013 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:34:04 crc kubenswrapper[4778]: I0930 17:34:04.710833 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8860-account-create-jp7dv" Sep 30 17:34:04 crc kubenswrapper[4778]: I0930 17:34:04.819036 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpp5q\" (UniqueName: \"kubernetes.io/projected/615fbb2f-3311-4f19-9ffa-6c01b802ae13-kube-api-access-dpp5q\") pod \"615fbb2f-3311-4f19-9ffa-6c01b802ae13\" (UID: \"615fbb2f-3311-4f19-9ffa-6c01b802ae13\") " Sep 30 17:34:04 crc kubenswrapper[4778]: I0930 17:34:04.825059 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615fbb2f-3311-4f19-9ffa-6c01b802ae13-kube-api-access-dpp5q" (OuterVolumeSpecName: "kube-api-access-dpp5q") pod "615fbb2f-3311-4f19-9ffa-6c01b802ae13" (UID: "615fbb2f-3311-4f19-9ffa-6c01b802ae13"). InnerVolumeSpecName "kube-api-access-dpp5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:34:04 crc kubenswrapper[4778]: I0930 17:34:04.920827 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpp5q\" (UniqueName: \"kubernetes.io/projected/615fbb2f-3311-4f19-9ffa-6c01b802ae13-kube-api-access-dpp5q\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.135511 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7888848bff-9rdhr"] Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.188579 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67d9465564-vcjl8"] Sep 30 17:34:05 crc kubenswrapper[4778]: E0930 17:34:05.188934 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615fbb2f-3311-4f19-9ffa-6c01b802ae13" containerName="mariadb-account-create" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.188952 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="615fbb2f-3311-4f19-9ffa-6c01b802ae13" containerName="mariadb-account-create" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.189093 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="615fbb2f-3311-4f19-9ffa-6c01b802ae13" containerName="mariadb-account-create" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.189904 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.192593 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.199334 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67d9465564-vcjl8"] Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.201390 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8860-account-create-jp7dv" event={"ID":"615fbb2f-3311-4f19-9ffa-6c01b802ae13","Type":"ContainerDied","Data":"b0697fa59f8ae67cc8b77d89a32fb0364139f088c595f69d467cfcfcfc500f70"} Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.201418 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0697fa59f8ae67cc8b77d89a32fb0364139f088c595f69d467cfcfcfc500f70" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.201467 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8860-account-create-jp7dv" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.230994 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-horizon-tls-certs\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.231036 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b48eff09-cd27-48b0-9633-0fd28d43e0a0-logs\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.231077 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-combined-ca-bundle\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.231104 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b48eff09-cd27-48b0-9633-0fd28d43e0a0-scripts\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.231161 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb5tj\" (UniqueName: \"kubernetes.io/projected/b48eff09-cd27-48b0-9633-0fd28d43e0a0-kube-api-access-hb5tj\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.231189 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b48eff09-cd27-48b0-9633-0fd28d43e0a0-config-data\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 
17:34:05.231228 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-horizon-secret-key\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.252747 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cdc899d97-c5xnr"] Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.264949 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f9d6b5dcb-zrl5s"] Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.266465 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f9d6b5dcb-zrl5s" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.292744 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f9d6b5dcb-zrl5s"] Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.316776 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f16f-account-create-n47bv"] Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.317825 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f16f-account-create-n47bv" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.320278 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.331894 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-horizon-secret-key\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.331940 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-horizon-tls-certs\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.331968 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b48eff09-cd27-48b0-9633-0fd28d43e0a0-logs\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.332007 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-combined-ca-bundle\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.332038 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b48eff09-cd27-48b0-9633-0fd28d43e0a0-scripts\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.332065 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-hb5tj\" (UniqueName: \"kubernetes.io/projected/b48eff09-cd27-48b0-9633-0fd28d43e0a0-kube-api-access-hb5tj\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.332102 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b48eff09-cd27-48b0-9633-0fd28d43e0a0-config-data\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.332918 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f16f-account-create-n47bv"] Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.334159 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b48eff09-cd27-48b0-9633-0fd28d43e0a0-scripts\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.338198 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b48eff09-cd27-48b0-9633-0fd28d43e0a0-config-data\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.339681 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-horizon-tls-certs\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.340394 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-combined-ca-bundle\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.341224 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b48eff09-cd27-48b0-9633-0fd28d43e0a0-logs\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.348776 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-horizon-secret-key\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.353323 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb5tj\" (UniqueName: \"kubernetes.io/projected/b48eff09-cd27-48b0-9633-0fd28d43e0a0-kube-api-access-hb5tj\") pod \"horizon-67d9465564-vcjl8\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.433661 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-lgsww\" (UniqueName: \"kubernetes.io/projected/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-kube-api-access-lgsww\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.433956 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-horizon-tls-certs\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.433983 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-logs\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.434005 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-horizon-secret-key\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.434021 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-combined-ca-bundle\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.434044 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-scripts\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.434089 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-config-data\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.434107 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb8ps\" (UniqueName: \"kubernetes.io/projected/4206afc7-5079-41a3-887c-62f6083aa72c-kube-api-access-pb8ps\") pod \"neutron-f16f-account-create-n47bv\" (UID: \"4206afc7-5079-41a3-887c-62f6083aa72c\") " pod="openstack/neutron-f16f-account-create-n47bv" Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.442238 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7888848bff-9rdhr"] Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.531710 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67d9465564-vcjl8"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.535527 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgsww\" (UniqueName: \"kubernetes.io/projected/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-kube-api-access-lgsww\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.535602 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-horizon-tls-certs\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.535676 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-logs\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.535702 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-combined-ca-bundle\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.535717 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-horizon-secret-key\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.535776 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-scripts\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.535820 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-config-data\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.535841 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb8ps\" (UniqueName: \"kubernetes.io/projected/4206afc7-5079-41a3-887c-62f6083aa72c-kube-api-access-pb8ps\") pod \"neutron-f16f-account-create-n47bv\" (UID: \"4206afc7-5079-41a3-887c-62f6083aa72c\") " pod="openstack/neutron-f16f-account-create-n47bv"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.538111 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-scripts\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.538456 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-logs\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.540652 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-horizon-tls-certs\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.553398 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-combined-ca-bundle\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.553985 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-config-data\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.554026 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-horizon-secret-key\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.554350 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb8ps\" (UniqueName: \"kubernetes.io/projected/4206afc7-5079-41a3-887c-62f6083aa72c-kube-api-access-pb8ps\") pod \"neutron-f16f-account-create-n47bv\" (UID: \"4206afc7-5079-41a3-887c-62f6083aa72c\") " pod="openstack/neutron-f16f-account-create-n47bv"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.561239 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgsww\" (UniqueName: \"kubernetes.io/projected/fcc69ce6-ef1c-42fc-a94e-daed766ec91d-kube-api-access-lgsww\") pod \"horizon-f9d6b5dcb-zrl5s\" (UID: \"fcc69ce6-ef1c-42fc-a94e-daed766ec91d\") " pod="openstack/horizon-f9d6b5dcb-zrl5s"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.606380 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f9d6b5dcb-zrl5s"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.608420 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67bfb67f5f-wzjct"]
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.614498 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-tgccd"]
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.635956 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f16f-account-create-n47bv"
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.663111 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-fmzbx"]
Sep 30 17:34:05 crc kubenswrapper[4778]: W0930 17:34:05.668332 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65f393a8_cd69_4aff_b73f_96416be3f843.slice/crio-5c7531fe62eb0d906f02a24c8709aa176577774b3fc0ad7e81d00cf31399316d WatchSource:0}: Error finding container 5c7531fe62eb0d906f02a24c8709aa176577774b3fc0ad7e81d00cf31399316d: Status 404 returned error can't find the container with id 5c7531fe62eb0d906f02a24c8709aa176577774b3fc0ad7e81d00cf31399316d
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.668838 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cdc899d97-c5xnr"]
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.677096 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xmpfk"]
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.690169 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-hpqcf"]
Sep 30 17:34:05 crc kubenswrapper[4778]: W0930 17:34:05.690523 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e093cbb_cd23_4a15_8159_fa514c992d60.slice/crio-b2323e3fd40f80dd2ac26428faf5687504c197caa5027b511acfa813a96f2ef6 WatchSource:0}: Error finding container b2323e3fd40f80dd2ac26428faf5687504c197caa5027b511acfa813a96f2ef6: Status 404 returned error can't find the container with id b2323e3fd40f80dd2ac26428faf5687504c197caa5027b511acfa813a96f2ef6
Sep 30 17:34:05 crc kubenswrapper[4778]: I0930 17:34:05.997051 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67d9465564-vcjl8"]
Sep 30 17:34:06 crc kubenswrapper[4778]: W0930 17:34:06.094065 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb48eff09_cd27_48b0_9633_0fd28d43e0a0.slice/crio-ad6ee72acbe69126b6461218bc3d0444dd96a1d9ce67fca6c94b44df90a08d83 WatchSource:0}: Error finding container ad6ee72acbe69126b6461218bc3d0444dd96a1d9ce67fca6c94b44df90a08d83: Status 404 returned error can't find the container with id ad6ee72acbe69126b6461218bc3d0444dd96a1d9ce67fca6c94b44df90a08d83
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.140810 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f9d6b5dcb-zrl5s"]
Sep 30 17:34:06 crc kubenswrapper[4778]: W0930 17:34:06.144247 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcc69ce6_ef1c_42fc_a94e_daed766ec91d.slice/crio-f72cff59e1f429ab1de4fe95d1a819b5dd764ab596896a208f20ad88462970d5 WatchSource:0}: Error finding container f72cff59e1f429ab1de4fe95d1a819b5dd764ab596896a208f20ad88462970d5: Status 404 returned error can't find the container with id f72cff59e1f429ab1de4fe95d1a819b5dd764ab596896a208f20ad88462970d5
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.210540 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67d9465564-vcjl8" event={"ID":"b48eff09-cd27-48b0-9633-0fd28d43e0a0","Type":"ContainerStarted","Data":"ad6ee72acbe69126b6461218bc3d0444dd96a1d9ce67fca6c94b44df90a08d83"}
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.211803 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cdc899d97-c5xnr" event={"ID":"3e093cbb-cd23-4a15-8159-fa514c992d60","Type":"ContainerStarted","Data":"b2323e3fd40f80dd2ac26428faf5687504c197caa5027b511acfa813a96f2ef6"}
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.213384 4778 generic.go:334] "Generic (PLEG): container finished" podID="561dc04f-4607-420e-bb56-2f8bd70d477e" containerID="1ffee447f26490f8fe61759a48bd66c50d4560164b92e2197009a41f67728e66" exitCode=0
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.213424 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" event={"ID":"561dc04f-4607-420e-bb56-2f8bd70d477e","Type":"ContainerDied","Data":"1ffee447f26490f8fe61759a48bd66c50d4560164b92e2197009a41f67728e66"}
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.213440 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" event={"ID":"561dc04f-4607-420e-bb56-2f8bd70d477e","Type":"ContainerStarted","Data":"dc696438079bbc85a1160b85aa6ceb8b1a39f6f3c01e52b5b0874a844ded52c3"}
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.215566 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67bfb67f5f-wzjct" event={"ID":"65f393a8-cd69-4aff-b73f-96416be3f843","Type":"ContainerStarted","Data":"5c7531fe62eb0d906f02a24c8709aa176577774b3fc0ad7e81d00cf31399316d"}
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.223885 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f9d6b5dcb-zrl5s" event={"ID":"fcc69ce6-ef1c-42fc-a94e-daed766ec91d","Type":"ContainerStarted","Data":"f72cff59e1f429ab1de4fe95d1a819b5dd764ab596896a208f20ad88462970d5"}
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.227049 4778 generic.go:334] "Generic (PLEG): container finished" podID="353a1d8f-2f85-430e-a2f5-adb617373c35" containerID="31b52749bbcf816877d77943f54cb53999c37d5e934da317b7876e652bbb4e3b" exitCode=0
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.227100 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" event={"ID":"353a1d8f-2f85-430e-a2f5-adb617373c35","Type":"ContainerDied","Data":"31b52749bbcf816877d77943f54cb53999c37d5e934da317b7876e652bbb4e3b"}
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.227114 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" event={"ID":"353a1d8f-2f85-430e-a2f5-adb617373c35","Type":"ContainerStarted","Data":"f3ba683ed54c41e641fa3c19e29843300dcba2ed9c80b515d712256fe278beb7"}
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.228197 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fmzbx" event={"ID":"d593f115-5fd5-4db3-9244-8c90696317c4","Type":"ContainerStarted","Data":"1a62f136a07a5e856880b9924d789b95f6811656aabc23fae29f1acec8550cf6"}
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.242779 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tgccd" event={"ID":"adeaa955-499d-477f-baf2-b68c6fa6c4d1","Type":"ContainerStarted","Data":"d7fa2809b29ef0fb7beb23c0ca2cc0508bca1d95a64f78a90519afaf4e3c1a3b"}
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.242850 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tgccd" event={"ID":"adeaa955-499d-477f-baf2-b68c6fa6c4d1","Type":"ContainerStarted","Data":"48f966fd9d6256a7980a9f68a08c264139173c4a14751d2ea59fe60677c0af1c"}
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.249366 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pk9vz" event={"ID":"e0280205-5cc0-45e5-8f2d-926b300cb348","Type":"ContainerStarted","Data":"22808c1658a4e26f82cff60e782fee9587b2047e51b29ff6a19d97a701a24271"}
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.254276 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7888848bff-9rdhr" event={"ID":"6ff8b4ac-2295-4913-8aa5-f0fb158b7521","Type":"ContainerStarted","Data":"7fc8da9c7d3fda25aa37fce9a9a33d52c6abfd94c947ea1c9f6c41056457fbb1"}
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.272707 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f16f-account-create-n47bv"]
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.282391 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-tgccd" podStartSLOduration=10.282376469 podStartE2EDuration="10.282376469s" podCreationTimestamp="2025-09-30 17:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:34:06.264891106 +0000 UTC m=+985.254788909" watchObservedRunningTime="2025-09-30 17:34:06.282376469 +0000 UTC m=+985.272274272"
Sep 30 17:34:06 crc kubenswrapper[4778]: W0930 17:34:06.288793 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4206afc7_5079_41a3_887c_62f6083aa72c.slice/crio-110343f20e73c5dad135f630dda51b46ec0012f94087ec06d9a7fb8ee0d9bb9b WatchSource:0}: Error finding container 110343f20e73c5dad135f630dda51b46ec0012f94087ec06d9a7fb8ee0d9bb9b: Status 404 returned error can't find the container with id 110343f20e73c5dad135f630dda51b46ec0012f94087ec06d9a7fb8ee0d9bb9b
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.289146 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-pk9vz" podStartSLOduration=4.938488499 podStartE2EDuration="18.289135495s" podCreationTimestamp="2025-09-30 17:33:48 +0000 UTC" firstStartedPulling="2025-09-30 17:33:51.490164038 +0000 UTC m=+970.480061851" lastFinishedPulling="2025-09-30 17:34:04.840811054 +0000 UTC m=+983.830708847" observedRunningTime="2025-09-30 17:34:06.280690717 +0000 UTC m=+985.270588520" watchObservedRunningTime="2025-09-30 17:34:06.289135495 +0000 UTC m=+985.279033298"
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.669537 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf"
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.762516 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-config\") pod \"561dc04f-4607-420e-bb56-2f8bd70d477e\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") "
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.762812 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-dns-svc\") pod \"561dc04f-4607-420e-bb56-2f8bd70d477e\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") "
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.762854 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-ovsdbserver-sb\") pod \"561dc04f-4607-420e-bb56-2f8bd70d477e\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") "
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.762887 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-ovsdbserver-nb\") pod \"561dc04f-4607-420e-bb56-2f8bd70d477e\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") "
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.763026 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzprz\" (UniqueName: \"kubernetes.io/projected/561dc04f-4607-420e-bb56-2f8bd70d477e-kube-api-access-rzprz\") pod \"561dc04f-4607-420e-bb56-2f8bd70d477e\" (UID: \"561dc04f-4607-420e-bb56-2f8bd70d477e\") "
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.790742 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "561dc04f-4607-420e-bb56-2f8bd70d477e" (UID: "561dc04f-4607-420e-bb56-2f8bd70d477e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.794065 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "561dc04f-4607-420e-bb56-2f8bd70d477e" (UID: "561dc04f-4607-420e-bb56-2f8bd70d477e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.794814 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/561dc04f-4607-420e-bb56-2f8bd70d477e-kube-api-access-rzprz" (OuterVolumeSpecName: "kube-api-access-rzprz") pod "561dc04f-4607-420e-bb56-2f8bd70d477e" (UID: "561dc04f-4607-420e-bb56-2f8bd70d477e"). InnerVolumeSpecName "kube-api-access-rzprz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.814150 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-config" (OuterVolumeSpecName: "config") pod "561dc04f-4607-420e-bb56-2f8bd70d477e" (UID: "561dc04f-4607-420e-bb56-2f8bd70d477e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.831559 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "561dc04f-4607-420e-bb56-2f8bd70d477e" (UID: "561dc04f-4607-420e-bb56-2f8bd70d477e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.865925 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.865992 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.866007 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.866018 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzprz\" (UniqueName: \"kubernetes.io/projected/561dc04f-4607-420e-bb56-2f8bd70d477e-kube-api-access-rzprz\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:06 crc kubenswrapper[4778]: I0930 17:34:06.866031 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/561dc04f-4607-420e-bb56-2f8bd70d477e-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:07 crc kubenswrapper[4778]: I0930 17:34:07.289127 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf" event={"ID":"561dc04f-4607-420e-bb56-2f8bd70d477e","Type":"ContainerDied","Data":"dc696438079bbc85a1160b85aa6ceb8b1a39f6f3c01e52b5b0874a844ded52c3"}
Sep 30 17:34:07 crc kubenswrapper[4778]: I0930 17:34:07.289155 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb4695fc-hpqcf"
Sep 30 17:34:07 crc kubenswrapper[4778]: I0930 17:34:07.289204 4778 scope.go:117] "RemoveContainer" containerID="1ffee447f26490f8fe61759a48bd66c50d4560164b92e2197009a41f67728e66"
Sep 30 17:34:07 crc kubenswrapper[4778]: I0930 17:34:07.292512 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" event={"ID":"353a1d8f-2f85-430e-a2f5-adb617373c35","Type":"ContainerStarted","Data":"eb908f147041f60adc56d3a16019817561ee7b05ee304e3c0ca14c3697d6e82d"}
Sep 30 17:34:07 crc kubenswrapper[4778]: I0930 17:34:07.292650 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk"
Sep 30 17:34:07 crc kubenswrapper[4778]: I0930 17:34:07.294873 4778 generic.go:334] "Generic (PLEG): container finished" podID="4206afc7-5079-41a3-887c-62f6083aa72c" containerID="08d85a2e1cdaa63fa3ff21fc3cae356be68eaa98a557e65636e331fbfbc8f25b" exitCode=0
Sep 30 17:34:07 crc kubenswrapper[4778]: I0930 17:34:07.294929 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f16f-account-create-n47bv" event={"ID":"4206afc7-5079-41a3-887c-62f6083aa72c","Type":"ContainerDied","Data":"08d85a2e1cdaa63fa3ff21fc3cae356be68eaa98a557e65636e331fbfbc8f25b"}
Sep 30 17:34:07 crc kubenswrapper[4778]: I0930 17:34:07.294987 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f16f-account-create-n47bv" event={"ID":"4206afc7-5079-41a3-887c-62f6083aa72c","Type":"ContainerStarted","Data":"110343f20e73c5dad135f630dda51b46ec0012f94087ec06d9a7fb8ee0d9bb9b"}
Sep 30 17:34:07 crc kubenswrapper[4778]: I0930 17:34:07.318491 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" podStartSLOduration=11.318476104 podStartE2EDuration="11.318476104s" podCreationTimestamp="2025-09-30 17:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:34:07.316264207 +0000 UTC m=+986.306162020" watchObservedRunningTime="2025-09-30 17:34:07.318476104 +0000 UTC m=+986.308373907"
Sep 30 17:34:07 crc kubenswrapper[4778]: I0930 17:34:07.373204 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-hpqcf"]
Sep 30 17:34:07 crc kubenswrapper[4778]: I0930 17:34:07.375784 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bb4695fc-hpqcf"]
Sep 30 17:34:07 crc kubenswrapper[4778]: I0930 17:34:07.727322 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="561dc04f-4607-420e-bb56-2f8bd70d477e" path="/var/lib/kubelet/pods/561dc04f-4607-420e-bb56-2f8bd70d477e/volumes"
Sep 30 17:34:09 crc kubenswrapper[4778]: I0930 17:34:09.323998 4778 generic.go:334] "Generic (PLEG): container finished" podID="adeaa955-499d-477f-baf2-b68c6fa6c4d1" containerID="d7fa2809b29ef0fb7beb23c0ca2cc0508bca1d95a64f78a90519afaf4e3c1a3b" exitCode=0
Sep 30 17:34:09 crc kubenswrapper[4778]: I0930 17:34:09.324056 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tgccd" event={"ID":"adeaa955-499d-477f-baf2-b68c6fa6c4d1","Type":"ContainerDied","Data":"d7fa2809b29ef0fb7beb23c0ca2cc0508bca1d95a64f78a90519afaf4e3c1a3b"}
Sep 30 17:34:09 crc kubenswrapper[4778]: I0930 17:34:09.732892 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f16f-account-create-n47bv"
Sep 30 17:34:09 crc kubenswrapper[4778]: I0930 17:34:09.928282 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb8ps\" (UniqueName: \"kubernetes.io/projected/4206afc7-5079-41a3-887c-62f6083aa72c-kube-api-access-pb8ps\") pod \"4206afc7-5079-41a3-887c-62f6083aa72c\" (UID: \"4206afc7-5079-41a3-887c-62f6083aa72c\") "
Sep 30 17:34:09 crc kubenswrapper[4778]: I0930 17:34:09.939341 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4206afc7-5079-41a3-887c-62f6083aa72c-kube-api-access-pb8ps" (OuterVolumeSpecName: "kube-api-access-pb8ps") pod "4206afc7-5079-41a3-887c-62f6083aa72c" (UID: "4206afc7-5079-41a3-887c-62f6083aa72c"). InnerVolumeSpecName "kube-api-access-pb8ps". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.030558 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb8ps\" (UniqueName: \"kubernetes.io/projected/4206afc7-5079-41a3-887c-62f6083aa72c-kube-api-access-pb8ps\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.333502 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f16f-account-create-n47bv" event={"ID":"4206afc7-5079-41a3-887c-62f6083aa72c","Type":"ContainerDied","Data":"110343f20e73c5dad135f630dda51b46ec0012f94087ec06d9a7fb8ee0d9bb9b"}
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.333578 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="110343f20e73c5dad135f630dda51b46ec0012f94087ec06d9a7fb8ee0d9bb9b"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.333701 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f16f-account-create-n47bv"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.399203 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-fsrx9"]
Sep 30 17:34:10 crc kubenswrapper[4778]: E0930 17:34:10.400062 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561dc04f-4607-420e-bb56-2f8bd70d477e" containerName="init"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.400083 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="561dc04f-4607-420e-bb56-2f8bd70d477e" containerName="init"
Sep 30 17:34:10 crc kubenswrapper[4778]: E0930 17:34:10.400141 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4206afc7-5079-41a3-887c-62f6083aa72c" containerName="mariadb-account-create"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.400150 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4206afc7-5079-41a3-887c-62f6083aa72c" containerName="mariadb-account-create"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.400340 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4206afc7-5079-41a3-887c-62f6083aa72c" containerName="mariadb-account-create"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.400376 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="561dc04f-4607-420e-bb56-2f8bd70d477e" containerName="init"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.400988 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.410795 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.410973 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mc96g"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.411186 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.432992 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fsrx9"]
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.537527 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-scripts\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.538326 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ef0e458-8d92-4b76-8da4-24357b6911dc-etc-machine-id\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.538427 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm9n6\" (UniqueName: \"kubernetes.io/projected/3ef0e458-8d92-4b76-8da4-24357b6911dc-kube-api-access-mm9n6\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.538547 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-combined-ca-bundle\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.538642 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-db-sync-config-data\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.538675 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-config-data\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.640562 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-db-sync-config-data\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.640634 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-config-data\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.640701 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-scripts\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.640789 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ef0e458-8d92-4b76-8da4-24357b6911dc-etc-machine-id\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.640844 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm9n6\" (UniqueName: \"kubernetes.io/projected/3ef0e458-8d92-4b76-8da4-24357b6911dc-kube-api-access-mm9n6\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.640927 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-combined-ca-bundle\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.641738 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ef0e458-8d92-4b76-8da4-24357b6911dc-etc-machine-id\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.648914 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-config-data\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.648945 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-scripts\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.659264 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-combined-ca-bundle\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.660267 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm9n6\" (UniqueName: \"kubernetes.io/projected/3ef0e458-8d92-4b76-8da4-24357b6911dc-kube-api-access-mm9n6\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.661825 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-db-sync-config-data\") pod \"cinder-db-sync-fsrx9\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:10 crc kubenswrapper[4778]: I0930 17:34:10.720007 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fsrx9"
Sep 30 17:34:12 crc kubenswrapper[4778]: I0930 17:34:12.351514 4778 generic.go:334] "Generic (PLEG): container finished" podID="e0280205-5cc0-45e5-8f2d-926b300cb348" containerID="22808c1658a4e26f82cff60e782fee9587b2047e51b29ff6a19d97a701a24271" exitCode=0
Sep 30 17:34:12 crc kubenswrapper[4778]: I0930 17:34:12.352129 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pk9vz" event={"ID":"e0280205-5cc0-45e5-8f2d-926b300cb348","Type":"ContainerDied","Data":"22808c1658a4e26f82cff60e782fee9587b2047e51b29ff6a19d97a701a24271"}
Sep 30 17:34:12 crc kubenswrapper[4778]: I0930 17:34:12.402157 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk"
Sep 30 17:34:12 crc kubenswrapper[4778]: I0930 17:34:12.461159 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qx7rk"]
Sep 30 17:34:12 crc kubenswrapper[4778]: I0930 17:34:12.461372 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" podUID="bb6d0d28-eda0-4390-bea4-7ba28fff79a7" containerName="dnsmasq-dns" containerID="cri-o://2c4a9a0a37751c1633e3d86e563d0d9cc2e97d7eabaf965b40c82b1cbc9de113" gracePeriod=10
Sep 30 17:34:13 crc kubenswrapper[4778]: I0930 17:34:13.363216 4778 generic.go:334] "Generic (PLEG): container finished" podID="bb6d0d28-eda0-4390-bea4-7ba28fff79a7" containerID="2c4a9a0a37751c1633e3d86e563d0d9cc2e97d7eabaf965b40c82b1cbc9de113" exitCode=0
Sep 30 17:34:13 crc kubenswrapper[4778]: I0930 17:34:13.363323 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" event={"ID":"bb6d0d28-eda0-4390-bea4-7ba28fff79a7","Type":"ContainerDied","Data":"2c4a9a0a37751c1633e3d86e563d0d9cc2e97d7eabaf965b40c82b1cbc9de113"}
Sep 30 17:34:14 crc kubenswrapper[4778]: I0930 17:34:14.812660 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 17:34:14 crc kubenswrapper[4778]: I0930 17:34:14.813307 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 17:34:14 crc kubenswrapper[4778]: I0930 17:34:14.994853 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pk9vz"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.027117 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tgccd"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.126649 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-scripts\") pod \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") "
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.127074 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp27n\" (UniqueName: \"kubernetes.io/projected/adeaa955-499d-477f-baf2-b68c6fa6c4d1-kube-api-access-qp27n\") pod \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") "
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.127220 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-combined-ca-bundle\") pod \"e0280205-5cc0-45e5-8f2d-926b300cb348\" (UID: \"e0280205-5cc0-45e5-8f2d-926b300cb348\") "
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.127273 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-fernet-keys\") pod \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") "
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.127300 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-credential-keys\") pod \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") "
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.127327 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-config-data\") pod \"e0280205-5cc0-45e5-8f2d-926b300cb348\" (UID: \"e0280205-5cc0-45e5-8f2d-926b300cb348\") "
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.127379 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-combined-ca-bundle\") pod \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") "
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.127411 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-config-data\") pod \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\" (UID: \"adeaa955-499d-477f-baf2-b68c6fa6c4d1\") "
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.127459 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swgrm\" (UniqueName: \"kubernetes.io/projected/e0280205-5cc0-45e5-8f2d-926b300cb348-kube-api-access-swgrm\") pod \"e0280205-5cc0-45e5-8f2d-926b300cb348\" (UID: \"e0280205-5cc0-45e5-8f2d-926b300cb348\") "
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.127486 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-db-sync-config-data\") pod \"e0280205-5cc0-45e5-8f2d-926b300cb348\" (UID: \"e0280205-5cc0-45e5-8f2d-926b300cb348\") "
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.134439 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-scripts" (OuterVolumeSpecName: "scripts") pod "adeaa955-499d-477f-baf2-b68c6fa6c4d1" (UID: "adeaa955-499d-477f-baf2-b68c6fa6c4d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.142402 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e0280205-5cc0-45e5-8f2d-926b300cb348" (UID: "e0280205-5cc0-45e5-8f2d-926b300cb348"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.142438 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adeaa955-499d-477f-baf2-b68c6fa6c4d1-kube-api-access-qp27n" (OuterVolumeSpecName: "kube-api-access-qp27n") pod "adeaa955-499d-477f-baf2-b68c6fa6c4d1" (UID: "adeaa955-499d-477f-baf2-b68c6fa6c4d1"). InnerVolumeSpecName "kube-api-access-qp27n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.142521 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "adeaa955-499d-477f-baf2-b68c6fa6c4d1" (UID: "adeaa955-499d-477f-baf2-b68c6fa6c4d1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.145282 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0280205-5cc0-45e5-8f2d-926b300cb348-kube-api-access-swgrm" (OuterVolumeSpecName: "kube-api-access-swgrm") pod "e0280205-5cc0-45e5-8f2d-926b300cb348" (UID: "e0280205-5cc0-45e5-8f2d-926b300cb348"). InnerVolumeSpecName "kube-api-access-swgrm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.167423 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "adeaa955-499d-477f-baf2-b68c6fa6c4d1" (UID: "adeaa955-499d-477f-baf2-b68c6fa6c4d1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.199575 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "adeaa955-499d-477f-baf2-b68c6fa6c4d1" (UID: "adeaa955-499d-477f-baf2-b68c6fa6c4d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.200149 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-config-data" (OuterVolumeSpecName: "config-data") pod "adeaa955-499d-477f-baf2-b68c6fa6c4d1" (UID: "adeaa955-499d-477f-baf2-b68c6fa6c4d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.229424 4778 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-fernet-keys\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.229456 4778 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-credential-keys\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.229467 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.229476 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.229486 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swgrm\" (UniqueName: \"kubernetes.io/projected/e0280205-5cc0-45e5-8f2d-926b300cb348-kube-api-access-swgrm\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.229495 4778 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.229503 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/adeaa955-499d-477f-baf2-b68c6fa6c4d1-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.229512 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp27n\" (UniqueName: \"kubernetes.io/projected/adeaa955-499d-477f-baf2-b68c6fa6c4d1-kube-api-access-qp27n\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.254028 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0280205-5cc0-45e5-8f2d-926b300cb348" (UID: "e0280205-5cc0-45e5-8f2d-926b300cb348"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.263912 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.317925 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-config-data" (OuterVolumeSpecName: "config-data") pod "e0280205-5cc0-45e5-8f2d-926b300cb348" (UID: "e0280205-5cc0-45e5-8f2d-926b300cb348"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.335497 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.335544 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0280205-5cc0-45e5-8f2d-926b300cb348-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.400647 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67bfb67f5f-wzjct" event={"ID":"65f393a8-cd69-4aff-b73f-96416be3f843","Type":"ContainerStarted","Data":"2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3"}
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.406106 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67d9465564-vcjl8" event={"ID":"b48eff09-cd27-48b0-9633-0fd28d43e0a0","Type":"ContainerStarted","Data":"f46ea7bb14934f472ac14ea511b321663dabc3b03e06c1ab29ba50f82a78b301"}
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.413383 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fmzbx" event={"ID":"d593f115-5fd5-4db3-9244-8c90696317c4","Type":"ContainerStarted","Data":"e64124d4ae4a42703521c628147e5049512a8e8b4191312ed43b78166529eb40"}
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.419581 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-tgccd" event={"ID":"adeaa955-499d-477f-baf2-b68c6fa6c4d1","Type":"ContainerDied","Data":"48f966fd9d6256a7980a9f68a08c264139173c4a14751d2ea59fe60677c0af1c"}
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.419630 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48f966fd9d6256a7980a9f68a08c264139173c4a14751d2ea59fe60677c0af1c"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.420019 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-tgccd"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.426017 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cdc899d97-c5xnr" event={"ID":"3e093cbb-cd23-4a15-8159-fa514c992d60","Type":"ContainerStarted","Data":"f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4"}
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.429289 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-fmzbx" podStartSLOduration=10.185261461 podStartE2EDuration="19.429276367s" podCreationTimestamp="2025-09-30 17:33:56 +0000 UTC" firstStartedPulling="2025-09-30 17:34:05.692591746 +0000 UTC m=+984.682489549" lastFinishedPulling="2025-09-30 17:34:14.936606652 +0000 UTC m=+993.926504455" observedRunningTime="2025-09-30 17:34:15.425901833 +0000 UTC m=+994.415799626" watchObservedRunningTime="2025-09-30 17:34:15.429276367 +0000 UTC m=+994.419174160"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.437816 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7gxf\" (UniqueName: \"kubernetes.io/projected/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-kube-api-access-m7gxf\") pod \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") "
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.437966 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-dns-svc\") pod \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") "
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.438001 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-ovsdbserver-nb\") pod \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") "
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.438083 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-config\") pod \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") "
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.438113 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-ovsdbserver-sb\") pod \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\" (UID: \"bb6d0d28-eda0-4390-bea4-7ba28fff79a7\") "
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.442169 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fsrx9"]
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.443268 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.443835 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-qx7rk" event={"ID":"bb6d0d28-eda0-4390-bea4-7ba28fff79a7","Type":"ContainerDied","Data":"58f0f5ffe739bb917107f52f938b8b65bfa54ddc93319a1bab3af6fbe2db83e9"}
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.443903 4778 scope.go:117] "RemoveContainer" containerID="2c4a9a0a37751c1633e3d86e563d0d9cc2e97d7eabaf965b40c82b1cbc9de113"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.448312 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pk9vz" event={"ID":"e0280205-5cc0-45e5-8f2d-926b300cb348","Type":"ContainerDied","Data":"7a365050019582519fbdcc016e44af1c1bdc1e86431f3e6101cf0c3c5d0be8fc"}
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.448347 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a365050019582519fbdcc016e44af1c1bdc1e86431f3e6101cf0c3c5d0be8fc"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.448427 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pk9vz"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.448656 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-kube-api-access-m7gxf" (OuterVolumeSpecName: "kube-api-access-m7gxf") pod "bb6d0d28-eda0-4390-bea4-7ba28fff79a7" (UID: "bb6d0d28-eda0-4390-bea4-7ba28fff79a7"). InnerVolumeSpecName "kube-api-access-m7gxf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.454134 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7888848bff-9rdhr" event={"ID":"6ff8b4ac-2295-4913-8aa5-f0fb158b7521","Type":"ContainerStarted","Data":"57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39"}
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.454277 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7888848bff-9rdhr" podUID="6ff8b4ac-2295-4913-8aa5-f0fb158b7521" containerName="horizon-log" containerID="cri-o://57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39" gracePeriod=30
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.454324 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7888848bff-9rdhr" podUID="6ff8b4ac-2295-4913-8aa5-f0fb158b7521" containerName="horizon" containerID="cri-o://49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894" gracePeriod=30
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.494646 4778 scope.go:117] "RemoveContainer" containerID="26dafc618861cb8faee3114255dfa7ef13691ea3eada354c149657512fa944fc"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.499149 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7888848bff-9rdhr" podStartSLOduration=10.00463496 podStartE2EDuration="19.499137484s" podCreationTimestamp="2025-09-30 17:33:56 +0000 UTC" firstStartedPulling="2025-09-30 17:34:05.449046619 +0000 UTC m=+984.438944422" lastFinishedPulling="2025-09-30 17:34:14.943549133 +0000 UTC m=+993.933446946" observedRunningTime="2025-09-30 17:34:15.472610727 +0000 UTC m=+994.462508530" watchObservedRunningTime="2025-09-30 17:34:15.499137484 +0000 UTC m=+994.489035287"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.541455 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7gxf\" (UniqueName: \"kubernetes.io/projected/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-kube-api-access-m7gxf\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.581874 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bb6d0d28-eda0-4390-bea4-7ba28fff79a7" (UID: "bb6d0d28-eda0-4390-bea4-7ba28fff79a7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.581883 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bb6d0d28-eda0-4390-bea4-7ba28fff79a7" (UID: "bb6d0d28-eda0-4390-bea4-7ba28fff79a7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.609357 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-nz4tj"]
Sep 30 17:34:15 crc kubenswrapper[4778]: E0930 17:34:15.609658 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6d0d28-eda0-4390-bea4-7ba28fff79a7" containerName="dnsmasq-dns"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.609670 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6d0d28-eda0-4390-bea4-7ba28fff79a7" containerName="dnsmasq-dns"
Sep 30 17:34:15 crc kubenswrapper[4778]: E0930 17:34:15.609703 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb6d0d28-eda0-4390-bea4-7ba28fff79a7" containerName="init"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.609709 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb6d0d28-eda0-4390-bea4-7ba28fff79a7" containerName="init"
Sep 30 17:34:15 crc kubenswrapper[4778]: E0930 17:34:15.609718 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adeaa955-499d-477f-baf2-b68c6fa6c4d1" containerName="keystone-bootstrap"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.609724 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="adeaa955-499d-477f-baf2-b68c6fa6c4d1" containerName="keystone-bootstrap"
Sep 30 17:34:15 crc kubenswrapper[4778]: E0930 17:34:15.609738 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0280205-5cc0-45e5-8f2d-926b300cb348" containerName="glance-db-sync"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.609743 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0280205-5cc0-45e5-8f2d-926b300cb348" containerName="glance-db-sync"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.609878 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="adeaa955-499d-477f-baf2-b68c6fa6c4d1" containerName="keystone-bootstrap"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.609894 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0280205-5cc0-45e5-8f2d-926b300cb348" containerName="glance-db-sync"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.609910 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb6d0d28-eda0-4390-bea4-7ba28fff79a7" containerName="dnsmasq-dns"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.611275 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nz4tj"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.615071 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.615328 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-w4pf2"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.615445 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.631854 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-nz4tj"]
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.643243 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.643281 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.663137 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bb6d0d28-eda0-4390-bea4-7ba28fff79a7" (UID: "bb6d0d28-eda0-4390-bea4-7ba28fff79a7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.690918 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-config" (OuterVolumeSpecName: "config") pod "bb6d0d28-eda0-4390-bea4-7ba28fff79a7" (UID: "bb6d0d28-eda0-4390-bea4-7ba28fff79a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.745038 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-combined-ca-bundle\") pod \"neutron-db-sync-nz4tj\" (UID: \"9fcfd6aa-7524-4fb2-8230-f35ed39691a9\") " pod="openstack/neutron-db-sync-nz4tj"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.745127 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-config\") pod \"neutron-db-sync-nz4tj\" (UID: \"9fcfd6aa-7524-4fb2-8230-f35ed39691a9\") " pod="openstack/neutron-db-sync-nz4tj"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.745210 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmkht\" (UniqueName: \"kubernetes.io/projected/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-kube-api-access-gmkht\") pod \"neutron-db-sync-nz4tj\" (UID: \"9fcfd6aa-7524-4fb2-8230-f35ed39691a9\") " pod="openstack/neutron-db-sync-nz4tj"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.745330 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.745346 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb6d0d28-eda0-4390-bea4-7ba28fff79a7-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.773731 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qx7rk"]
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.783392 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-qx7rk"]
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.846815 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-combined-ca-bundle\") pod \"neutron-db-sync-nz4tj\" (UID: \"9fcfd6aa-7524-4fb2-8230-f35ed39691a9\") " pod="openstack/neutron-db-sync-nz4tj"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.846885 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-config\") pod \"neutron-db-sync-nz4tj\" (UID: \"9fcfd6aa-7524-4fb2-8230-f35ed39691a9\") " pod="openstack/neutron-db-sync-nz4tj"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.846987 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmkht\" (UniqueName: \"kubernetes.io/projected/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-kube-api-access-gmkht\") pod \"neutron-db-sync-nz4tj\" (UID: \"9fcfd6aa-7524-4fb2-8230-f35ed39691a9\") " pod="openstack/neutron-db-sync-nz4tj"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.853282 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-config\") pod \"neutron-db-sync-nz4tj\" (UID: \"9fcfd6aa-7524-4fb2-8230-f35ed39691a9\") " pod="openstack/neutron-db-sync-nz4tj"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.853275 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-combined-ca-bundle\") pod \"neutron-db-sync-nz4tj\" (UID: \"9fcfd6aa-7524-4fb2-8230-f35ed39691a9\") " pod="openstack/neutron-db-sync-nz4tj"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.864321 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmkht\" (UniqueName: \"kubernetes.io/projected/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-kube-api-access-gmkht\") pod \"neutron-db-sync-nz4tj\" (UID: \"9fcfd6aa-7524-4fb2-8230-f35ed39691a9\") " pod="openstack/neutron-db-sync-nz4tj"
Sep 30 17:34:15 crc kubenswrapper[4778]: I0930 17:34:15.968063 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nz4tj"
Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.222903 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-tgccd"]
Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.256513 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-tgccd"]
Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.299962 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-pz5ss"]
Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.308178 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pz5ss"
Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.316031 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.316133 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.316277 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.317073 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zmxkr"
Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.337517 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-nz4tj"]
Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.367775 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pz5ss"]
Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.426691 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncscw\" (UniqueName: \"kubernetes.io/projected/92387610-886b-4105-8de1-87fa92e2215e-kube-api-access-ncscw\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss"
Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.426798 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-fernet-keys\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss"
Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.426855 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName:
\"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-scripts\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.426896 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-config-data\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.441761 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-combined-ca-bundle\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.441883 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-credential-keys\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.500979 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67bfb67f5f-wzjct" podUID="65f393a8-cd69-4aff-b73f-96416be3f843" containerName="horizon-log" containerID="cri-o://2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3" gracePeriod=30 Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.501084 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67bfb67f5f-wzjct" podUID="65f393a8-cd69-4aff-b73f-96416be3f843" containerName="horizon" containerID="cri-o://83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf" gracePeriod=30 Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.501123 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67bfb67f5f-wzjct" event={"ID":"65f393a8-cd69-4aff-b73f-96416be3f843","Type":"ContainerStarted","Data":"83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf"} Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.539788 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-zhq6z"] Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.544169 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncscw\" (UniqueName: \"kubernetes.io/projected/92387610-886b-4105-8de1-87fa92e2215e-kube-api-access-ncscw\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.544232 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-fernet-keys\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.545319 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-scripts\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.545369 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-config-data\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.545410 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-combined-ca-bundle\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.545453 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-credential-keys\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.547788 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.566243 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-credential-keys\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.566528 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-config-data\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.566897 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-fernet-keys\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.572084 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-scripts\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.582232 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-combined-ca-bundle\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.583499 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncscw\" (UniqueName: 
\"kubernetes.io/projected/92387610-886b-4105-8de1-87fa92e2215e-kube-api-access-ncscw\") pod \"keystone-bootstrap-pz5ss\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.588432 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f9d6b5dcb-zrl5s" event={"ID":"fcc69ce6-ef1c-42fc-a94e-daed766ec91d","Type":"ContainerStarted","Data":"73a74a7ab2b398bea7d852622fe6c0a37986265f89f57521a534f2e6b17210f0"} Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.588482 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f9d6b5dcb-zrl5s" event={"ID":"fcc69ce6-ef1c-42fc-a94e-daed766ec91d","Type":"ContainerStarted","Data":"6effa2d5c350d6f4b569ccdd8ed2853ca313df8fb2648bcdab8202b02bb73ef7"} Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.592107 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-zhq6z"] Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.592300 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-67bfb67f5f-wzjct" podStartSLOduration=11.298886957 podStartE2EDuration="20.592280327s" podCreationTimestamp="2025-09-30 17:33:56 +0000 UTC" firstStartedPulling="2025-09-30 17:34:05.677873878 +0000 UTC m=+984.667771681" lastFinishedPulling="2025-09-30 17:34:14.971267248 +0000 UTC m=+993.961165051" observedRunningTime="2025-09-30 17:34:16.528117113 +0000 UTC m=+995.518014916" watchObservedRunningTime="2025-09-30 17:34:16.592280327 +0000 UTC m=+995.582178130" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.651357 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67d9465564-vcjl8" event={"ID":"b48eff09-cd27-48b0-9633-0fd28d43e0a0","Type":"ContainerStarted","Data":"e621921148275d7f14a235d97f55f5194ed9f470a27069ec84c67cab8d3f9b84"} Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.652499 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-f9d6b5dcb-zrl5s" podStartSLOduration=2.67817575 podStartE2EDuration="11.652477921s" podCreationTimestamp="2025-09-30 17:34:05 +0000 UTC" firstStartedPulling="2025-09-30 17:34:06.147376187 +0000 UTC m=+985.137274000" lastFinishedPulling="2025-09-30 17:34:15.121678368 +0000 UTC m=+994.111576171" observedRunningTime="2025-09-30 17:34:16.629365026 +0000 UTC m=+995.619262829" watchObservedRunningTime="2025-09-30 17:34:16.652477921 +0000 UTC m=+995.642375724" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.653389 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-zhq6z\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.653527 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-config\") pod \"dnsmasq-dns-7987f74bbc-zhq6z\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.653650 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq48r\" (UniqueName: 
\"kubernetes.io/projected/1fb952cc-e13f-4021-b29e-0d37280a67bc-kube-api-access-tq48r\") pod \"dnsmasq-dns-7987f74bbc-zhq6z\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.653754 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-zhq6z\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.653851 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-zhq6z\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.657416 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cdc899d97-c5xnr" event={"ID":"3e093cbb-cd23-4a15-8159-fa514c992d60","Type":"ContainerStarted","Data":"b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba"} Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.657571 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cdc899d97-c5xnr" podUID="3e093cbb-cd23-4a15-8159-fa514c992d60" containerName="horizon-log" containerID="cri-o://f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4" gracePeriod=30 Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.657818 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5cdc899d97-c5xnr" podUID="3e093cbb-cd23-4a15-8159-fa514c992d60" containerName="horizon" containerID="cri-o://b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba" gracePeriod=30 Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.665295 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fsrx9" event={"ID":"3ef0e458-8d92-4b76-8da4-24357b6911dc","Type":"ContainerStarted","Data":"c479550acad54299941fa0cd3359e34fc4df9ec7eb3d3c0a3f22eef9c74b56f9"} Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.667698 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7888848bff-9rdhr" event={"ID":"6ff8b4ac-2295-4913-8aa5-f0fb158b7521","Type":"ContainerStarted","Data":"49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894"} Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.670272 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nz4tj" event={"ID":"9fcfd6aa-7524-4fb2-8230-f35ed39691a9","Type":"ContainerStarted","Data":"afbb9053f7242982fb338f8f243c8c411912ece34db279085f7a0b10d2d7179b"} Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.701804 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-67d9465564-vcjl8" podStartSLOduration=2.722334664 podStartE2EDuration="11.701778762s" podCreationTimestamp="2025-09-30 17:34:05 +0000 UTC" firstStartedPulling="2025-09-30 17:34:06.096718394 +0000 UTC m=+985.086616197" lastFinishedPulling="2025-09-30 17:34:15.076162492 +0000 UTC m=+994.066060295" observedRunningTime="2025-09-30 17:34:16.700368878 +0000 UTC m=+995.690266681" watchObservedRunningTime="2025-09-30 
17:34:16.701778762 +0000 UTC m=+995.691676565" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.748658 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-nz4tj" podStartSLOduration=1.748597348 podStartE2EDuration="1.748597348s" podCreationTimestamp="2025-09-30 17:34:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:34:16.717605334 +0000 UTC m=+995.707503137" watchObservedRunningTime="2025-09-30 17:34:16.748597348 +0000 UTC m=+995.738495151" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.754469 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.755576 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-zhq6z\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.755655 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-zhq6z\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.755808 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-zhq6z\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.755884 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-config\") pod \"dnsmasq-dns-7987f74bbc-zhq6z\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.755929 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq48r\" (UniqueName: \"kubernetes.io/projected/1fb952cc-e13f-4021-b29e-0d37280a67bc-kube-api-access-tq48r\") pod \"dnsmasq-dns-7987f74bbc-zhq6z\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.761988 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-zhq6z\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.762195 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-zhq6z\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.763002 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-config\") pod \"dnsmasq-dns-7987f74bbc-zhq6z\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.773271 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-zhq6z\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.779739 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq48r\" (UniqueName: \"kubernetes.io/projected/1fb952cc-e13f-4021-b29e-0d37280a67bc-kube-api-access-tq48r\") pod \"dnsmasq-dns-7987f74bbc-zhq6z\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.784101 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5cdc899d97-c5xnr" podStartSLOduration=9.515124694 podStartE2EDuration="18.784084249s" podCreationTimestamp="2025-09-30 17:33:58 +0000 UTC" firstStartedPulling="2025-09-30 17:34:05.699421074 +0000 UTC m=+984.689318877" lastFinishedPulling="2025-09-30 17:34:14.968380629 +0000 UTC m=+993.958278432" observedRunningTime="2025-09-30 17:34:16.742892634 +0000 UTC m=+995.732790437" watchObservedRunningTime="2025-09-30 17:34:16.784084249 +0000 UTC m=+995.773982052" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.888720 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.893223 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:16 crc kubenswrapper[4778]: I0930 17:34:16.977938 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.270604 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pz5ss"] Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.363929 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.365305 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.368838 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.368941 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v8k48" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.374191 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.376904 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.450874 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-zhq6z"] Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.472778 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-config-data\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.472813 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-logs\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.473006 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-scripts\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.473050 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.473080 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqrtn\" (UniqueName: \"kubernetes.io/projected/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-kube-api-access-sqrtn\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.473107 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.473135 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.574309 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-config-data\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.574583 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-logs\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.574661 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-scripts\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.574691 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.574710 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqrtn\" (UniqueName: \"kubernetes.io/projected/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-kube-api-access-sqrtn\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.574732 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.574751 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.575153 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.575289 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: 
\"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.576002 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-logs\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.582522 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-scripts\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.586364 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.589116 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-config-data\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.592158 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqrtn\" (UniqueName: \"kubernetes.io/projected/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-kube-api-access-sqrtn\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.600038 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.643826 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.646002 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.650812 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.668170 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.703955 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" event={"ID":"1fb952cc-e13f-4021-b29e-0d37280a67bc","Type":"ContainerStarted","Data":"c3e5b812141e1581c21543059dea70439a6d08eca2a415c91e5d45835b86d757"} Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.708115 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nz4tj" event={"ID":"9fcfd6aa-7524-4fb2-8230-f35ed39691a9","Type":"ContainerStarted","Data":"0cb1f5e4af2e98ea8536703ab065645ebe25a13bacf52de3595ee646acd2e076"} Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.730313 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adeaa955-499d-477f-baf2-b68c6fa6c4d1" path="/var/lib/kubelet/pods/adeaa955-499d-477f-baf2-b68c6fa6c4d1/volumes" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.731490 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb6d0d28-eda0-4390-bea4-7ba28fff79a7" path="/var/lib/kubelet/pods/bb6d0d28-eda0-4390-bea4-7ba28fff79a7/volumes" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.732066 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pz5ss" event={"ID":"92387610-886b-4105-8de1-87fa92e2215e","Type":"ContainerStarted","Data":"f135482638939de05acdcb30bc45fe93fe6c629b8c27502568aa448986935ace"} Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.732096 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pz5ss" event={"ID":"92387610-886b-4105-8de1-87fa92e2215e","Type":"ContainerStarted","Data":"e9603dd55f41de66279d58943e141c95ba2be7a22c9c1f286b99e98627d499c0"} Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.746516 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.747372 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-pz5ss" podStartSLOduration=1.747330585 podStartE2EDuration="1.747330585s" podCreationTimestamp="2025-09-30 17:34:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:34:17.740100595 +0000 UTC m=+996.729998398" watchObservedRunningTime="2025-09-30 17:34:17.747330585 +0000 UTC m=+996.737228388" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.777794 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.777873 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwqnh\" (UniqueName: \"kubernetes.io/projected/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-kube-api-access-mwqnh\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.778107 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-logs\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.778300 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.778537 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.778726 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.778821 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.892405 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.897346 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.897961 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.898314 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwqnh\" (UniqueName: \"kubernetes.io/projected/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-kube-api-access-mwqnh\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.898493 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-logs\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.899006 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.899190 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.899715 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.900846 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.902675 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-logs\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.905699 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.906850 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.923644 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.924232 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwqnh\" (UniqueName: \"kubernetes.io/projected/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-kube-api-access-mwqnh\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.966262 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:17 crc kubenswrapper[4778]: I0930 17:34:17.982974 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 17:34:18 crc kubenswrapper[4778]: I0930 17:34:18.355802 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 17:34:18 crc kubenswrapper[4778]: W0930 17:34:18.376847 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24a9d722_f84a_4cf9_bcb4_0b4a79e2c95e.slice/crio-a8b4b6d93cbb22112a4c896172da3bc49cfdaafb42ea6a6a8e945cd3ef956c30 WatchSource:0}: Error finding container a8b4b6d93cbb22112a4c896172da3bc49cfdaafb42ea6a6a8e945cd3ef956c30: Status 404 returned error can't find the container with id a8b4b6d93cbb22112a4c896172da3bc49cfdaafb42ea6a6a8e945cd3ef956c30 Sep 30 17:34:18 crc kubenswrapper[4778]: I0930 17:34:18.601668 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 17:34:18 crc kubenswrapper[4778]: I0930 17:34:18.730200 4778 generic.go:334] "Generic (PLEG): container finished" podID="d593f115-5fd5-4db3-9244-8c90696317c4" containerID="e64124d4ae4a42703521c628147e5049512a8e8b4191312ed43b78166529eb40" exitCode=0 Sep 30 17:34:18 crc kubenswrapper[4778]: I0930 17:34:18.730285 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fmzbx" event={"ID":"d593f115-5fd5-4db3-9244-8c90696317c4","Type":"ContainerDied","Data":"e64124d4ae4a42703521c628147e5049512a8e8b4191312ed43b78166529eb40"} Sep 30 17:34:18 crc kubenswrapper[4778]: I0930 17:34:18.734313 4778 generic.go:334] "Generic (PLEG): container finished" podID="1fb952cc-e13f-4021-b29e-0d37280a67bc" containerID="63b0c74fd0afda931a8c88bc3faaf31ae49b572fdd32f7aba1cbf3e06a492496" exitCode=0 Sep 30 17:34:18 crc kubenswrapper[4778]: I0930 17:34:18.734380 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" event={"ID":"1fb952cc-e13f-4021-b29e-0d37280a67bc","Type":"ContainerDied","Data":"63b0c74fd0afda931a8c88bc3faaf31ae49b572fdd32f7aba1cbf3e06a492496"} Sep 30 17:34:18 crc kubenswrapper[4778]: I0930 17:34:18.735438 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e","Type":"ContainerStarted","Data":"a8b4b6d93cbb22112a4c896172da3bc49cfdaafb42ea6a6a8e945cd3ef956c30"} Sep 30 17:34:18 crc kubenswrapper[4778]: I0930 17:34:18.741580 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1ed2ef4c-b8e2-455f-99ef-8856a63112fa","Type":"ContainerStarted","Data":"1bb333843f1ddb684b29d770c64eabd4dfdfedffa2b47eefc49a95b624deb9b4"} Sep 30 17:34:18 crc kubenswrapper[4778]: I0930 17:34:18.973848 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:34:19 crc kubenswrapper[4778]: I0930 17:34:19.408347 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 17:34:19 crc kubenswrapper[4778]: I0930 17:34:19.464074 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 17:34:20 crc kubenswrapper[4778]: I0930 17:34:20.926983 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-fmzbx" Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.079731 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-scripts\") pod \"d593f115-5fd5-4db3-9244-8c90696317c4\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.080092 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfrxz\" (UniqueName: \"kubernetes.io/projected/d593f115-5fd5-4db3-9244-8c90696317c4-kube-api-access-lfrxz\") pod \"d593f115-5fd5-4db3-9244-8c90696317c4\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.080223 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-combined-ca-bundle\") pod \"d593f115-5fd5-4db3-9244-8c90696317c4\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.080332 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-config-data\") pod \"d593f115-5fd5-4db3-9244-8c90696317c4\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.080353 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d593f115-5fd5-4db3-9244-8c90696317c4-logs\") pod \"d593f115-5fd5-4db3-9244-8c90696317c4\" (UID: \"d593f115-5fd5-4db3-9244-8c90696317c4\") " Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.081014 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d593f115-5fd5-4db3-9244-8c90696317c4-logs" (OuterVolumeSpecName: "logs") pod "d593f115-5fd5-4db3-9244-8c90696317c4" (UID: "d593f115-5fd5-4db3-9244-8c90696317c4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.089842 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-scripts" (OuterVolumeSpecName: "scripts") pod "d593f115-5fd5-4db3-9244-8c90696317c4" (UID: "d593f115-5fd5-4db3-9244-8c90696317c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.089903 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d593f115-5fd5-4db3-9244-8c90696317c4-kube-api-access-lfrxz" (OuterVolumeSpecName: "kube-api-access-lfrxz") pod "d593f115-5fd5-4db3-9244-8c90696317c4" (UID: "d593f115-5fd5-4db3-9244-8c90696317c4"). InnerVolumeSpecName "kube-api-access-lfrxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.113695 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d593f115-5fd5-4db3-9244-8c90696317c4" (UID: "d593f115-5fd5-4db3-9244-8c90696317c4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.118251 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-config-data" (OuterVolumeSpecName: "config-data") pod "d593f115-5fd5-4db3-9244-8c90696317c4" (UID: "d593f115-5fd5-4db3-9244-8c90696317c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.182461 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.182492 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.182501 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d593f115-5fd5-4db3-9244-8c90696317c4-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.182510 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d593f115-5fd5-4db3-9244-8c90696317c4-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.182542 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfrxz\" (UniqueName: \"kubernetes.io/projected/d593f115-5fd5-4db3-9244-8c90696317c4-kube-api-access-lfrxz\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.797976 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-fmzbx" event={"ID":"d593f115-5fd5-4db3-9244-8c90696317c4","Type":"ContainerDied","Data":"1a62f136a07a5e856880b9924d789b95f6811656aabc23fae29f1acec8550cf6"} Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.798012 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a62f136a07a5e856880b9924d789b95f6811656aabc23fae29f1acec8550cf6" Sep 30 17:34:21 crc kubenswrapper[4778]: I0930 17:34:21.798060 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-fmzbx" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.039847 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-768b5657b-6fmpz"] Sep 30 17:34:22 crc kubenswrapper[4778]: E0930 17:34:22.040199 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d593f115-5fd5-4db3-9244-8c90696317c4" containerName="placement-db-sync" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.040211 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="d593f115-5fd5-4db3-9244-8c90696317c4" containerName="placement-db-sync" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.040366 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="d593f115-5fd5-4db3-9244-8c90696317c4" containerName="placement-db-sync" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.041196 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.043582 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.043598 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.043634 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-6c4tb" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.043856 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.044541 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.051173 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-768b5657b-6fmpz"] Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.104323 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9c1cfdd-d923-493f-a3b0-f75756047aeb-internal-tls-certs\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.104428 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9c1cfdd-d923-493f-a3b0-f75756047aeb-scripts\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.104486 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9c1cfdd-d923-493f-a3b0-f75756047aeb-public-tls-certs\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.104508 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9c1cfdd-d923-493f-a3b0-f75756047aeb-config-data\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.104576 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xklmf\" (UniqueName: \"kubernetes.io/projected/e9c1cfdd-d923-493f-a3b0-f75756047aeb-kube-api-access-xklmf\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.104822 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9c1cfdd-d923-493f-a3b0-f75756047aeb-logs\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.104915 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9c1cfdd-d923-493f-a3b0-f75756047aeb-combined-ca-bundle\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.207202 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9c1cfdd-d923-493f-a3b0-f75756047aeb-scripts\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.207281 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9c1cfdd-d923-493f-a3b0-f75756047aeb-public-tls-certs\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.207307 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9c1cfdd-d923-493f-a3b0-f75756047aeb-config-data\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.207355 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xklmf\" (UniqueName: \"kubernetes.io/projected/e9c1cfdd-d923-493f-a3b0-f75756047aeb-kube-api-access-xklmf\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.207417 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9c1cfdd-d923-493f-a3b0-f75756047aeb-logs\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.207448 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9c1cfdd-d923-493f-a3b0-f75756047aeb-combined-ca-bundle\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.207507 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9c1cfdd-d923-493f-a3b0-f75756047aeb-internal-tls-certs\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.207845 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9c1cfdd-d923-493f-a3b0-f75756047aeb-logs\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.212801 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e9c1cfdd-d923-493f-a3b0-f75756047aeb-config-data\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.213405 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9c1cfdd-d923-493f-a3b0-f75756047aeb-internal-tls-certs\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.213508 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e9c1cfdd-d923-493f-a3b0-f75756047aeb-scripts\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.215240 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9c1cfdd-d923-493f-a3b0-f75756047aeb-combined-ca-bundle\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.215693 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e9c1cfdd-d923-493f-a3b0-f75756047aeb-public-tls-certs\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.226171 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xklmf\" (UniqueName: \"kubernetes.io/projected/e9c1cfdd-d923-493f-a3b0-f75756047aeb-kube-api-access-xklmf\") pod \"placement-768b5657b-6fmpz\" (UID: \"e9c1cfdd-d923-493f-a3b0-f75756047aeb\") " pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:22 crc kubenswrapper[4778]: I0930 17:34:22.390337 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:24 crc kubenswrapper[4778]: I0930 17:34:24.823473 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e","Type":"ContainerStarted","Data":"7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b"} Sep 30 17:34:24 crc kubenswrapper[4778]: I0930 17:34:24.826976 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" event={"ID":"1fb952cc-e13f-4021-b29e-0d37280a67bc","Type":"ContainerStarted","Data":"ef37ba9fc55a1d367409150e0e8aa8b0c234947138c95593ea75f3cc7f40e9db"} Sep 30 17:34:24 crc kubenswrapper[4778]: I0930 17:34:24.827120 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:24 crc kubenswrapper[4778]: I0930 17:34:24.854809 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" podStartSLOduration=8.8547932 podStartE2EDuration="8.8547932s" podCreationTimestamp="2025-09-30 17:34:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:34:24.849081506 +0000 UTC m=+1003.838979309" watchObservedRunningTime="2025-09-30 17:34:24.8547932 +0000 UTC m=+1003.844691003" Sep 30 17:34:25 crc kubenswrapper[4778]: I0930 17:34:25.533219 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:25 crc kubenswrapper[4778]: I0930 17:34:25.533281 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:25 crc kubenswrapper[4778]: I0930 17:34:25.536282 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67d9465564-vcjl8" podUID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.134:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.134:8443: connect: connection refused" Sep 30 17:34:25 crc kubenswrapper[4778]: I0930 17:34:25.607474 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f9d6b5dcb-zrl5s" Sep 30 17:34:25 crc kubenswrapper[4778]: I0930 17:34:25.607528 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-f9d6b5dcb-zrl5s" Sep 30 17:34:25 crc kubenswrapper[4778]: I0930 17:34:25.610288 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f9d6b5dcb-zrl5s" podUID="fcc69ce6-ef1c-42fc-a94e-daed766ec91d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.135:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.135:8443: connect: connection refused" Sep 30 17:34:25 crc kubenswrapper[4778]: I0930 17:34:25.839096 4778 generic.go:334] "Generic (PLEG): container finished" podID="92387610-886b-4105-8de1-87fa92e2215e" containerID="f135482638939de05acdcb30bc45fe93fe6c629b8c27502568aa448986935ace" exitCode=0 Sep 30 17:34:25 crc kubenswrapper[4778]: I0930 17:34:25.839190 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pz5ss" event={"ID":"92387610-886b-4105-8de1-87fa92e2215e","Type":"ContainerDied","Data":"f135482638939de05acdcb30bc45fe93fe6c629b8c27502568aa448986935ace"} Sep 30 17:34:25 crc kubenswrapper[4778]: I0930 17:34:25.840704 4778 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1ed2ef4c-b8e2-455f-99ef-8856a63112fa","Type":"ContainerStarted","Data":"f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c"} Sep 30 17:34:31 crc kubenswrapper[4778]: I0930 17:34:31.894849 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:31 crc kubenswrapper[4778]: I0930 17:34:31.969401 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xmpfk"] Sep 30 17:34:31 crc kubenswrapper[4778]: I0930 17:34:31.969656 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" podUID="353a1d8f-2f85-430e-a2f5-adb617373c35" containerName="dnsmasq-dns" containerID="cri-o://eb908f147041f60adc56d3a16019817561ee7b05ee304e3c0ca14c3697d6e82d" gracePeriod=10 Sep 30 17:34:32 crc kubenswrapper[4778]: I0930 17:34:32.400723 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" podUID="353a1d8f-2f85-430e-a2f5-adb617373c35" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Sep 30 17:34:33 crc kubenswrapper[4778]: I0930 17:34:33.005413 4778 generic.go:334] "Generic (PLEG): container finished" podID="353a1d8f-2f85-430e-a2f5-adb617373c35" containerID="eb908f147041f60adc56d3a16019817561ee7b05ee304e3c0ca14c3697d6e82d" exitCode=0 Sep 30 17:34:33 crc kubenswrapper[4778]: I0930 17:34:33.005462 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" event={"ID":"353a1d8f-2f85-430e-a2f5-adb617373c35","Type":"ContainerDied","Data":"eb908f147041f60adc56d3a16019817561ee7b05ee304e3c0ca14c3697d6e82d"} Sep 30 17:34:35 crc kubenswrapper[4778]: I0930 17:34:35.533041 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-67d9465564-vcjl8" podUID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.134:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.134:8443: connect: connection refused" Sep 30 17:34:35 crc kubenswrapper[4778]: I0930 17:34:35.608244 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-f9d6b5dcb-zrl5s" podUID="fcc69ce6-ef1c-42fc-a94e-daed766ec91d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.135:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.135:8443: connect: connection refused" Sep 30 17:34:37 crc kubenswrapper[4778]: I0930 17:34:37.400521 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" podUID="353a1d8f-2f85-430e-a2f5-adb617373c35" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.132:5353: connect: connection refused" Sep 30 17:34:39 crc kubenswrapper[4778]: E0930 17:34:39.035699 4778 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Sep 30 17:34:39 crc kubenswrapper[4778]: E0930 17:34:39.036075 4778 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mm9n6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-fsrx9_openstack(3ef0e458-8d92-4b76-8da4-24357b6911dc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 17:34:39 crc kubenswrapper[4778]: E0930 17:34:39.037857 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-fsrx9" podUID="3ef0e458-8d92-4b76-8da4-24357b6911dc" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.105220 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pz5ss" event={"ID":"92387610-886b-4105-8de1-87fa92e2215e","Type":"ContainerDied","Data":"e9603dd55f41de66279d58943e141c95ba2be7a22c9c1f286b99e98627d499c0"} Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.105261 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9603dd55f41de66279d58943e141c95ba2be7a22c9c1f286b99e98627d499c0" Sep 30 17:34:39 crc kubenswrapper[4778]: E0930 17:34:39.107158 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-fsrx9" podUID="3ef0e458-8d92-4b76-8da4-24357b6911dc" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.161247 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.339173 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncscw\" (UniqueName: \"kubernetes.io/projected/92387610-886b-4105-8de1-87fa92e2215e-kube-api-access-ncscw\") pod \"92387610-886b-4105-8de1-87fa92e2215e\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.339238 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-credential-keys\") pod \"92387610-886b-4105-8de1-87fa92e2215e\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.339277 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-config-data\") pod \"92387610-886b-4105-8de1-87fa92e2215e\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.339357 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-scripts\") pod \"92387610-886b-4105-8de1-87fa92e2215e\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.339399 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-fernet-keys\") pod \"92387610-886b-4105-8de1-87fa92e2215e\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.339439 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-combined-ca-bundle\") pod \"92387610-886b-4105-8de1-87fa92e2215e\" (UID: \"92387610-886b-4105-8de1-87fa92e2215e\") " Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.345254 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "92387610-886b-4105-8de1-87fa92e2215e" (UID: "92387610-886b-4105-8de1-87fa92e2215e"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.345346 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92387610-886b-4105-8de1-87fa92e2215e-kube-api-access-ncscw" (OuterVolumeSpecName: "kube-api-access-ncscw") pod "92387610-886b-4105-8de1-87fa92e2215e" (UID: "92387610-886b-4105-8de1-87fa92e2215e"). InnerVolumeSpecName "kube-api-access-ncscw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.345787 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "92387610-886b-4105-8de1-87fa92e2215e" (UID: "92387610-886b-4105-8de1-87fa92e2215e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.346297 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-scripts" (OuterVolumeSpecName: "scripts") pod "92387610-886b-4105-8de1-87fa92e2215e" (UID: "92387610-886b-4105-8de1-87fa92e2215e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.352844 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.378347 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92387610-886b-4105-8de1-87fa92e2215e" (UID: "92387610-886b-4105-8de1-87fa92e2215e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.378405 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-config-data" (OuterVolumeSpecName: "config-data") pod "92387610-886b-4105-8de1-87fa92e2215e" (UID: "92387610-886b-4105-8de1-87fa92e2215e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.440829 4778 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.440865 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.440877 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncscw\" (UniqueName: \"kubernetes.io/projected/92387610-886b-4105-8de1-87fa92e2215e-kube-api-access-ncscw\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.440886 4778 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.440894 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.440902 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/92387610-886b-4105-8de1-87fa92e2215e-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.495820 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-768b5657b-6fmpz"] Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.542265 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5vkn\" (UniqueName: \"kubernetes.io/projected/353a1d8f-2f85-430e-a2f5-adb617373c35-kube-api-access-k5vkn\") pod \"353a1d8f-2f85-430e-a2f5-adb617373c35\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.542515 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-dns-svc\") pod \"353a1d8f-2f85-430e-a2f5-adb617373c35\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.542597 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-config\") pod \"353a1d8f-2f85-430e-a2f5-adb617373c35\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.542795 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-ovsdbserver-sb\") pod \"353a1d8f-2f85-430e-a2f5-adb617373c35\" (UID: \"353a1d8f-2f85-430e-a2f5-adb617373c35\") " Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.542908 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-ovsdbserver-nb\") pod \"353a1d8f-2f85-430e-a2f5-adb617373c35\" (UID: 
\"353a1d8f-2f85-430e-a2f5-adb617373c35\") " Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.552987 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/353a1d8f-2f85-430e-a2f5-adb617373c35-kube-api-access-k5vkn" (OuterVolumeSpecName: "kube-api-access-k5vkn") pod "353a1d8f-2f85-430e-a2f5-adb617373c35" (UID: "353a1d8f-2f85-430e-a2f5-adb617373c35"). InnerVolumeSpecName "kube-api-access-k5vkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.617607 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "353a1d8f-2f85-430e-a2f5-adb617373c35" (UID: "353a1d8f-2f85-430e-a2f5-adb617373c35"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.634733 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-config" (OuterVolumeSpecName: "config") pod "353a1d8f-2f85-430e-a2f5-adb617373c35" (UID: "353a1d8f-2f85-430e-a2f5-adb617373c35"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.641658 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "353a1d8f-2f85-430e-a2f5-adb617373c35" (UID: "353a1d8f-2f85-430e-a2f5-adb617373c35"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.642371 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "353a1d8f-2f85-430e-a2f5-adb617373c35" (UID: "353a1d8f-2f85-430e-a2f5-adb617373c35"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.644829 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.644937 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.645005 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.645069 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/353a1d8f-2f85-430e-a2f5-adb617373c35-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:39 crc kubenswrapper[4778]: I0930 17:34:39.645132 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5vkn\" (UniqueName: \"kubernetes.io/projected/353a1d8f-2f85-430e-a2f5-adb617373c35-kube-api-access-k5vkn\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.115154 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-768b5657b-6fmpz" event={"ID":"e9c1cfdd-d923-493f-a3b0-f75756047aeb","Type":"ContainerStarted","Data":"52382b60c65270cca7156f408aff158d130b781116ad6546e1158bdc3a9840bc"} Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.115458 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-768b5657b-6fmpz" event={"ID":"e9c1cfdd-d923-493f-a3b0-f75756047aeb","Type":"ContainerStarted","Data":"be68b0ff79e916a31d7982c58b0b04782a9da82bd7db7141ac1b418a325a1172"} Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.118069 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" event={"ID":"353a1d8f-2f85-430e-a2f5-adb617373c35","Type":"ContainerDied","Data":"f3ba683ed54c41e641fa3c19e29843300dcba2ed9c80b515d712256fe278beb7"} Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.118098 4778 scope.go:117] "RemoveContainer" containerID="eb908f147041f60adc56d3a16019817561ee7b05ee304e3c0ca14c3697d6e82d" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.118111 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-745b9ddc8c-xmpfk" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.126721 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1ed2ef4c-b8e2-455f-99ef-8856a63112fa" containerName="glance-log" containerID="cri-o://f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c" gracePeriod=30 Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.127229 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1ed2ef4c-b8e2-455f-99ef-8856a63112fa","Type":"ContainerStarted","Data":"82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506"} Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.127456 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1ed2ef4c-b8e2-455f-99ef-8856a63112fa" containerName="glance-httpd" containerID="cri-o://82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506" gracePeriod=30 Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.153159 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pz5ss" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.153182 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e","Type":"ContainerStarted","Data":"4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4"} Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.153340 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" containerName="glance-log" containerID="cri-o://7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b" gracePeriod=30 Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.153803 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" containerName="glance-httpd" containerID="cri-o://4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4" gracePeriod=30 Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.156984 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=24.156969164 podStartE2EDuration="24.156969164s" podCreationTimestamp="2025-09-30 17:34:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:34:40.150487936 +0000 UTC m=+1019.140385729" watchObservedRunningTime="2025-09-30 17:34:40.156969164 +0000 UTC m=+1019.146866967" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.165028 4778 scope.go:117] "RemoveContainer" containerID="31b52749bbcf816877d77943f54cb53999c37d5e934da317b7876e652bbb4e3b" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.184122 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=24.18410268 podStartE2EDuration="24.18410268s" podCreationTimestamp="2025-09-30 17:34:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:34:40.177725165 +0000 UTC 
m=+1019.167622968" watchObservedRunningTime="2025-09-30 17:34:40.18410268 +0000 UTC m=+1019.174000483" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.208026 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xmpfk"] Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.218340 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-745b9ddc8c-xmpfk"] Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.290822 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-56c95856-xw6pr"] Sep 30 17:34:40 crc kubenswrapper[4778]: E0930 17:34:40.291269 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92387610-886b-4105-8de1-87fa92e2215e" containerName="keystone-bootstrap" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.291290 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="92387610-886b-4105-8de1-87fa92e2215e" containerName="keystone-bootstrap" Sep 30 17:34:40 crc kubenswrapper[4778]: E0930 17:34:40.291323 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="353a1d8f-2f85-430e-a2f5-adb617373c35" containerName="dnsmasq-dns" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.291334 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="353a1d8f-2f85-430e-a2f5-adb617373c35" containerName="dnsmasq-dns" Sep 30 17:34:40 crc kubenswrapper[4778]: E0930 17:34:40.291362 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="353a1d8f-2f85-430e-a2f5-adb617373c35" containerName="init" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.291370 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="353a1d8f-2f85-430e-a2f5-adb617373c35" containerName="init" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.291592 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="92387610-886b-4105-8de1-87fa92e2215e" containerName="keystone-bootstrap" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.291658 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="353a1d8f-2f85-430e-a2f5-adb617373c35" containerName="dnsmasq-dns" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.292421 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.296823 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.297082 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.297226 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.297339 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zmxkr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.297441 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.302499 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.304994 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56c95856-xw6pr"] Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.359725 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-config-data\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.359937 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbtjh\" (UniqueName: \"kubernetes.io/projected/45cbc907-3ec0-4124-9a97-9e84c4f09145-kube-api-access-pbtjh\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.360058 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-internal-tls-certs\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.360089 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-public-tls-certs\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.360141 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-scripts\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.360266 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-combined-ca-bundle\") pod \"keystone-56c95856-xw6pr\" (UID: 
\"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.360336 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-fernet-keys\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.360404 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-credential-keys\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.461525 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-config-data\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.461601 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbtjh\" (UniqueName: \"kubernetes.io/projected/45cbc907-3ec0-4124-9a97-9e84c4f09145-kube-api-access-pbtjh\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.461649 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-internal-tls-certs\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.461666 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-public-tls-certs\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.461691 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-scripts\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.461731 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-combined-ca-bundle\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.461756 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-fernet-keys\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: 
I0930 17:34:40.461796 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-credential-keys\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.469379 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-public-tls-certs\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.469733 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-credential-keys\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.469521 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-fernet-keys\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.469403 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-scripts\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.470754 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-internal-tls-certs\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.471216 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-combined-ca-bundle\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.471752 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45cbc907-3ec0-4124-9a97-9e84c4f09145-config-data\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.481366 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbtjh\" (UniqueName: \"kubernetes.io/projected/45cbc907-3ec0-4124-9a97-9e84c4f09145-kube-api-access-pbtjh\") pod \"keystone-56c95856-xw6pr\" (UID: \"45cbc907-3ec0-4124-9a97-9e84c4f09145\") " pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.618898 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.739110 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.869377 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-combined-ca-bundle\") pod \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.869572 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-scripts\") pod \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.870603 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-config-data\") pod \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.870676 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-httpd-run\") pod \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.870729 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-logs\") pod \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.870782 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwqnh\" (UniqueName: \"kubernetes.io/projected/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-kube-api-access-mwqnh\") pod \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.870842 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\" (UID: \"1ed2ef4c-b8e2-455f-99ef-8856a63112fa\") " Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.871189 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1ed2ef4c-b8e2-455f-99ef-8856a63112fa" (UID: "1ed2ef4c-b8e2-455f-99ef-8856a63112fa"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.871235 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-logs" (OuterVolumeSpecName: "logs") pod "1ed2ef4c-b8e2-455f-99ef-8856a63112fa" (UID: "1ed2ef4c-b8e2-455f-99ef-8856a63112fa"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.873311 4778 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.873338 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.875778 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-scripts" (OuterVolumeSpecName: "scripts") pod "1ed2ef4c-b8e2-455f-99ef-8856a63112fa" (UID: "1ed2ef4c-b8e2-455f-99ef-8856a63112fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.877229 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.879773 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-kube-api-access-mwqnh" (OuterVolumeSpecName: "kube-api-access-mwqnh") pod "1ed2ef4c-b8e2-455f-99ef-8856a63112fa" (UID: "1ed2ef4c-b8e2-455f-99ef-8856a63112fa"). InnerVolumeSpecName "kube-api-access-mwqnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.880319 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "1ed2ef4c-b8e2-455f-99ef-8856a63112fa" (UID: "1ed2ef4c-b8e2-455f-99ef-8856a63112fa"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.898403 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ed2ef4c-b8e2-455f-99ef-8856a63112fa" (UID: "1ed2ef4c-b8e2-455f-99ef-8856a63112fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.915248 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-config-data" (OuterVolumeSpecName: "config-data") pod "1ed2ef4c-b8e2-455f-99ef-8856a63112fa" (UID: "1ed2ef4c-b8e2-455f-99ef-8856a63112fa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.973695 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqrtn\" (UniqueName: \"kubernetes.io/projected/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-kube-api-access-sqrtn\") pod \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.973827 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-config-data\") pod \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.973880 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.973924 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-logs\") pod \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.973945 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-httpd-run\") pod \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.973972 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-combined-ca-bundle\") pod \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.974026 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-scripts\") pod \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\" (UID: \"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e\") " Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.974263 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.974278 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.974288 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwqnh\" (UniqueName: \"kubernetes.io/projected/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-kube-api-access-mwqnh\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.974317 4778 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 
17:34:40.974329 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ed2ef4c-b8e2-455f-99ef-8856a63112fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.974744 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-logs" (OuterVolumeSpecName: "logs") pod "24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" (UID: "24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.977710 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" (UID: "24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.978438 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" (UID: "24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.982693 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-scripts" (OuterVolumeSpecName: "scripts") pod "24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" (UID: "24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.993826 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-kube-api-access-sqrtn" (OuterVolumeSpecName: "kube-api-access-sqrtn") pod "24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" (UID: "24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e"). InnerVolumeSpecName "kube-api-access-sqrtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:34:40 crc kubenswrapper[4778]: I0930 17:34:40.997857 4778 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.018271 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" (UID: "24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.035820 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-config-data" (OuterVolumeSpecName: "config-data") pod "24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" (UID: "24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.075682 4778 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.075756 4778 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.075769 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.075801 4778 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.075810 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.075819 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.075826 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqrtn\" (UniqueName: \"kubernetes.io/projected/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-kube-api-access-sqrtn\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.075836 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.097714 4778 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.172472 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-56c95856-xw6pr"] Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.180377 4778 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.188917 4778 generic.go:334] "Generic (PLEG): container finished" podID="1ed2ef4c-b8e2-455f-99ef-8856a63112fa" containerID="82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506" exitCode=0 Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.188959 4778 generic.go:334] "Generic (PLEG): container finished" podID="1ed2ef4c-b8e2-455f-99ef-8856a63112fa" containerID="f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c" exitCode=143 Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.188967 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"1ed2ef4c-b8e2-455f-99ef-8856a63112fa","Type":"ContainerDied","Data":"82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506"} Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.189009 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1ed2ef4c-b8e2-455f-99ef-8856a63112fa","Type":"ContainerDied","Data":"f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c"} Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.189025 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1ed2ef4c-b8e2-455f-99ef-8856a63112fa","Type":"ContainerDied","Data":"1bb333843f1ddb684b29d770c64eabd4dfdfedffa2b47eefc49a95b624deb9b4"} Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.189043 4778 scope.go:117] "RemoveContainer" containerID="82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.189041 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.195689 4778 generic.go:334] "Generic (PLEG): container finished" podID="24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" containerID="4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4" exitCode=0 Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.195722 4778 generic.go:334] "Generic (PLEG): container finished" podID="24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" containerID="7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b" exitCode=143 Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.195752 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.195775 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e","Type":"ContainerDied","Data":"4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4"} Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.195803 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e","Type":"ContainerDied","Data":"7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b"} Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.195820 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e","Type":"ContainerDied","Data":"a8b4b6d93cbb22112a4c896172da3bc49cfdaafb42ea6a6a8e945cd3ef956c30"} Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.201783 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-768b5657b-6fmpz" event={"ID":"e9c1cfdd-d923-493f-a3b0-f75756047aeb","Type":"ContainerStarted","Data":"92dc3b180856b03b9c78ce6f1a46cb085ac618bf82ef4dd5f86a1e79c66bd2b7"} Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.201928 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.225264 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-768b5657b-6fmpz" podStartSLOduration=19.225241248 podStartE2EDuration="19.225241248s" podCreationTimestamp="2025-09-30 17:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:34:41.221433663 +0000 UTC m=+1020.211331466" watchObservedRunningTime="2025-09-30 17:34:41.225241248 +0000 UTC m=+1020.215139051" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.268086 4778 scope.go:117] "RemoveContainer" containerID="f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.281441 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.294086 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.298134 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.306491 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.325423 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 17:34:41 crc kubenswrapper[4778]: E0930 17:34:41.326208 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed2ef4c-b8e2-455f-99ef-8856a63112fa" containerName="glance-httpd" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.326226 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed2ef4c-b8e2-455f-99ef-8856a63112fa" containerName="glance-httpd" Sep 30 17:34:41 crc kubenswrapper[4778]: E0930 17:34:41.326262 4778 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" containerName="glance-log" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.326270 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" containerName="glance-log" Sep 30 17:34:41 crc kubenswrapper[4778]: E0930 17:34:41.326286 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" containerName="glance-httpd" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.326292 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" containerName="glance-httpd" Sep 30 17:34:41 crc kubenswrapper[4778]: E0930 17:34:41.326323 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed2ef4c-b8e2-455f-99ef-8856a63112fa" containerName="glance-log" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.326330 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed2ef4c-b8e2-455f-99ef-8856a63112fa" containerName="glance-log" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.326694 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed2ef4c-b8e2-455f-99ef-8856a63112fa" containerName="glance-httpd" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.326727 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" containerName="glance-log" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.326741 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed2ef4c-b8e2-455f-99ef-8856a63112fa" containerName="glance-log" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.326750 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" containerName="glance-httpd" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.328495 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.332681 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.332749 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.332876 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.338737 4778 scope.go:117] "RemoveContainer" containerID="82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.338912 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v8k48" Sep 30 17:34:41 crc kubenswrapper[4778]: E0930 17:34:41.341254 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506\": container with ID starting with 82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506 not found: ID does not exist" containerID="82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.341605 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506"} err="failed to get container status \"82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506\": rpc error: code = NotFound desc = could not find container \"82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506\": container with ID starting with 82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506 not found: ID does not exist" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.341753 4778 scope.go:117] "RemoveContainer" containerID="f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c" Sep 30 17:34:41 crc kubenswrapper[4778]: E0930 17:34:41.342577 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c\": container with ID starting with f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c not found: ID does not exist" containerID="f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.342628 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c"} err="failed to get container status \"f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c\": rpc error: code = NotFound desc = could not find container \"f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c\": container with ID starting with f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c not found: ID does not exist" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.342652 4778 scope.go:117] "RemoveContainer" containerID="82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.343019 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506"} err="failed to get container status \"82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506\": rpc error: code = NotFound desc = could not find container \"82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506\": container with ID starting with 82f06299e4d602877b354c266b7a60b055e1d77614dea465e5e14c70d6197506 not found: ID does not exist" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.343037 4778 scope.go:117] "RemoveContainer" containerID="f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.343562 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c"} err="failed to get container status \"f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c\": rpc error: code = NotFound desc = could not find container \"f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c\": container with ID starting with f675b5e4699860b2fc1e10c7c89f833e63a4f0bb5b4aeb0181cb0b19207d448c not found: ID does not exist" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.346138 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.343639 4778 scope.go:117] "RemoveContainer" containerID="4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.360746 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.362708 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.363117 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.400753 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.407370 4778 scope.go:117] "RemoveContainer" containerID="7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.417850 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.435413 4778 scope.go:117] "RemoveContainer" containerID="4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4" Sep 30 17:34:41 crc kubenswrapper[4778]: E0930 17:34:41.435972 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4\": container with ID starting with 4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4 not found: ID does not exist" containerID="4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.436027 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4"} err="failed to get container status 
\"4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4\": rpc error: code = NotFound desc = could not find container \"4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4\": container with ID starting with 4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4 not found: ID does not exist" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.436099 4778 scope.go:117] "RemoveContainer" containerID="7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b" Sep 30 17:34:41 crc kubenswrapper[4778]: E0930 17:34:41.436493 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b\": container with ID starting with 7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b not found: ID does not exist" containerID="7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.436533 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b"} err="failed to get container status \"7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b\": rpc error: code = NotFound desc = could not find container \"7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b\": container with ID starting with 7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b not found: ID does not exist" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.436563 4778 scope.go:117] "RemoveContainer" containerID="4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.436874 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4"} err="failed to get container status \"4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4\": rpc error: code = NotFound desc = could not find container \"4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4\": container with ID starting with 4e066467b97128a3d75e09711144555bf1f04bbe62253ff118b2b3c906d661b4 not found: ID does not exist" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.436894 4778 scope.go:117] "RemoveContainer" containerID="7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.437077 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b"} err="failed to get container status \"7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b\": rpc error: code = NotFound desc = could not find container \"7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b\": container with ID starting with 7133c9da7ddc70c466c553bd724f345e81f3b861f6b81405cb745a9a5fef294b not found: ID does not exist" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.504903 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tflz4\" (UniqueName: \"kubernetes.io/projected/02f673aa-9764-4ff8-af4c-42eeaefd2d05-kube-api-access-tflz4\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc 
kubenswrapper[4778]: I0930 17:34:41.504947 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.504978 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-scripts\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.505238 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5fa61a80-4c21-425c-be4b-35ceac5c1d48-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.505365 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.505443 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljsfz\" (UniqueName: \"kubernetes.io/projected/5fa61a80-4c21-425c-be4b-35ceac5c1d48-kube-api-access-ljsfz\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.505541 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.505679 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.505770 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-config-data\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.505809 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " 
pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.505889 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.505914 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.505933 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fa61a80-4c21-425c-be4b-35ceac5c1d48-logs\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.505955 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02f673aa-9764-4ff8-af4c-42eeaefd2d05-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.505990 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02f673aa-9764-4ff8-af4c-42eeaefd2d05-logs\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.506090 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.607227 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tflz4\" (UniqueName: \"kubernetes.io/projected/02f673aa-9764-4ff8-af4c-42eeaefd2d05-kube-api-access-tflz4\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.607278 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.607315 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.607343 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5fa61a80-4c21-425c-be4b-35ceac5c1d48-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.607361 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.607380 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljsfz\" (UniqueName: \"kubernetes.io/projected/5fa61a80-4c21-425c-be4b-35ceac5c1d48-kube-api-access-ljsfz\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.607676 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.607637 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.607969 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5fa61a80-4c21-425c-be4b-35ceac5c1d48-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.607685 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.608129 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.608190 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-config-data\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " 
pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.608216 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.608269 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.608307 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.608341 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fa61a80-4c21-425c-be4b-35ceac5c1d48-logs\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.608395 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02f673aa-9764-4ff8-af4c-42eeaefd2d05-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.608499 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02f673aa-9764-4ff8-af4c-42eeaefd2d05-logs\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.608649 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.609561 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fa61a80-4c21-425c-be4b-35ceac5c1d48-logs\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.612036 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02f673aa-9764-4ff8-af4c-42eeaefd2d05-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.612641 4778 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02f673aa-9764-4ff8-af4c-42eeaefd2d05-logs\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.613394 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.621269 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-scripts\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.621508 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.622303 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.623108 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.624957 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.626245 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.628668 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-config-data\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.635188 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tflz4\" (UniqueName: 
\"kubernetes.io/projected/02f673aa-9764-4ff8-af4c-42eeaefd2d05-kube-api-access-tflz4\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.641985 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljsfz\" (UniqueName: \"kubernetes.io/projected/5fa61a80-4c21-425c-be4b-35ceac5c1d48-kube-api-access-ljsfz\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.673912 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " pod="openstack/glance-default-external-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.679015 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.727575 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v8k48" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.732593 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ed2ef4c-b8e2-455f-99ef-8856a63112fa" path="/var/lib/kubelet/pods/1ed2ef4c-b8e2-455f-99ef-8856a63112fa/volumes" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.733335 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e" path="/var/lib/kubelet/pods/24a9d722-f84a-4cf9-bcb4-0b4a79e2c95e/volumes" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.733920 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="353a1d8f-2f85-430e-a2f5-adb617373c35" path="/var/lib/kubelet/pods/353a1d8f-2f85-430e-a2f5-adb617373c35/volumes" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.734953 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 17:34:41 crc kubenswrapper[4778]: I0930 17:34:41.976424 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 17:34:42 crc kubenswrapper[4778]: I0930 17:34:42.229047 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56c95856-xw6pr" event={"ID":"45cbc907-3ec0-4124-9a97-9e84c4f09145","Type":"ContainerStarted","Data":"39f9666c98441cc49747927296a617739c54df1896dffb8ce8d34a83d67bbc2c"} Sep 30 17:34:42 crc kubenswrapper[4778]: I0930 17:34:42.229100 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-56c95856-xw6pr" event={"ID":"45cbc907-3ec0-4124-9a97-9e84c4f09145","Type":"ContainerStarted","Data":"4f56269770e351a38ea0ea2fb8333d46f0e8db49789bd1c2906fc90c673884e2"} Sep 30 17:34:42 crc kubenswrapper[4778]: I0930 17:34:42.229167 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:42 crc kubenswrapper[4778]: I0930 17:34:42.230326 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:34:42 crc kubenswrapper[4778]: I0930 17:34:42.261475 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-56c95856-xw6pr" podStartSLOduration=2.261456628 podStartE2EDuration="2.261456628s" podCreationTimestamp="2025-09-30 17:34:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:34:42.252297498 +0000 UTC m=+1021.242195311" watchObservedRunningTime="2025-09-30 17:34:42.261456628 +0000 UTC m=+1021.251354431" Sep 30 17:34:42 crc kubenswrapper[4778]: I0930 17:34:42.307484 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 17:34:42 crc kubenswrapper[4778]: I0930 17:34:42.525509 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 17:34:43 crc kubenswrapper[4778]: I0930 17:34:43.266714 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5fa61a80-4c21-425c-be4b-35ceac5c1d48","Type":"ContainerStarted","Data":"d73ad7a8c5107070fcea7006bd4439a8f4e64b8854a15902161f47775ee98fd7"} Sep 30 17:34:43 crc kubenswrapper[4778]: I0930 17:34:43.266755 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5fa61a80-4c21-425c-be4b-35ceac5c1d48","Type":"ContainerStarted","Data":"4229c9e36d43de023ca7c8e1cb141fa684a883b91fa9f0f8d3deffa3fa26afc7"} Sep 30 17:34:43 crc kubenswrapper[4778]: I0930 17:34:43.269923 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"02f673aa-9764-4ff8-af4c-42eeaefd2d05","Type":"ContainerStarted","Data":"f78dc41c523244de92dbd3a1259d3cb4401e0a812fba5d6d8377e1001a06b0e7"} Sep 30 17:34:43 crc kubenswrapper[4778]: I0930 17:34:43.269956 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"02f673aa-9764-4ff8-af4c-42eeaefd2d05","Type":"ContainerStarted","Data":"eb0aee333b62778fd1bcc2aa54f7ca550eb399fd537cc272d574c067f6bf9db9"} Sep 30 17:34:44 crc kubenswrapper[4778]: I0930 17:34:44.286167 4778 generic.go:334] "Generic (PLEG): container finished" podID="9fcfd6aa-7524-4fb2-8230-f35ed39691a9" containerID="0cb1f5e4af2e98ea8536703ab065645ebe25a13bacf52de3595ee646acd2e076" exitCode=0 Sep 30 17:34:44 crc kubenswrapper[4778]: I0930 17:34:44.286308 4778 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/neutron-db-sync-nz4tj" event={"ID":"9fcfd6aa-7524-4fb2-8230-f35ed39691a9","Type":"ContainerDied","Data":"0cb1f5e4af2e98ea8536703ab065645ebe25a13bacf52de3595ee646acd2e076"} Sep 30 17:34:44 crc kubenswrapper[4778]: I0930 17:34:44.292314 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"02f673aa-9764-4ff8-af4c-42eeaefd2d05","Type":"ContainerStarted","Data":"14a58edb9355c8a4b88820e76d44c0d225196d8d2e1c0e6a91d9a8ae7b960902"} Sep 30 17:34:44 crc kubenswrapper[4778]: I0930 17:34:44.295463 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5fa61a80-4c21-425c-be4b-35ceac5c1d48","Type":"ContainerStarted","Data":"b5a435f83dc96d7d29c1d79b339d933b9c293bfdfabdfef8845954ecf9e7fdf4"} Sep 30 17:34:44 crc kubenswrapper[4778]: I0930 17:34:44.346365 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.346338355 podStartE2EDuration="3.346338355s" podCreationTimestamp="2025-09-30 17:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:34:44.339395403 +0000 UTC m=+1023.329293216" watchObservedRunningTime="2025-09-30 17:34:44.346338355 +0000 UTC m=+1023.336236158" Sep 30 17:34:44 crc kubenswrapper[4778]: I0930 17:34:44.373919 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.373893764 podStartE2EDuration="3.373893764s" podCreationTimestamp="2025-09-30 17:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:34:44.363596211 +0000 UTC m=+1023.353494054" watchObservedRunningTime="2025-09-30 17:34:44.373893764 +0000 UTC m=+1023.363791587" Sep 30 17:34:44 crc kubenswrapper[4778]: I0930 17:34:44.811714 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:34:44 crc kubenswrapper[4778]: I0930 17:34:44.811866 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.720051 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-nz4tj" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.793270 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-combined-ca-bundle\") pod \"9fcfd6aa-7524-4fb2-8230-f35ed39691a9\" (UID: \"9fcfd6aa-7524-4fb2-8230-f35ed39691a9\") " Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.793651 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-config\") pod \"9fcfd6aa-7524-4fb2-8230-f35ed39691a9\" (UID: \"9fcfd6aa-7524-4fb2-8230-f35ed39691a9\") " Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.793714 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmkht\" (UniqueName: \"kubernetes.io/projected/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-kube-api-access-gmkht\") pod \"9fcfd6aa-7524-4fb2-8230-f35ed39691a9\" (UID: \"9fcfd6aa-7524-4fb2-8230-f35ed39691a9\") " Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.811733 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-kube-api-access-gmkht" (OuterVolumeSpecName: "kube-api-access-gmkht") pod "9fcfd6aa-7524-4fb2-8230-f35ed39691a9" (UID: "9fcfd6aa-7524-4fb2-8230-f35ed39691a9"). InnerVolumeSpecName "kube-api-access-gmkht". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.819982 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-config" (OuterVolumeSpecName: "config") pod "9fcfd6aa-7524-4fb2-8230-f35ed39691a9" (UID: "9fcfd6aa-7524-4fb2-8230-f35ed39691a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.827590 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fcfd6aa-7524-4fb2-8230-f35ed39691a9" (UID: "9fcfd6aa-7524-4fb2-8230-f35ed39691a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.847060 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.895467 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-logs\") pod \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.895531 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-scripts\") pod \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.895657 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgtmm\" (UniqueName: \"kubernetes.io/projected/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-kube-api-access-bgtmm\") pod \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.895734 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-horizon-secret-key\") pod \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.895930 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-config-data\") pod \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\" (UID: \"6ff8b4ac-2295-4913-8aa5-f0fb158b7521\") " Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.896232 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-logs" (OuterVolumeSpecName: "logs") pod "6ff8b4ac-2295-4913-8aa5-f0fb158b7521" (UID: "6ff8b4ac-2295-4913-8aa5-f0fb158b7521"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.896372 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.896391 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmkht\" (UniqueName: \"kubernetes.io/projected/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-kube-api-access-gmkht\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.896404 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.896412 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fcfd6aa-7524-4fb2-8230-f35ed39691a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.899511 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-kube-api-access-bgtmm" (OuterVolumeSpecName: "kube-api-access-bgtmm") pod "6ff8b4ac-2295-4913-8aa5-f0fb158b7521" (UID: "6ff8b4ac-2295-4913-8aa5-f0fb158b7521"). InnerVolumeSpecName "kube-api-access-bgtmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.901717 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "6ff8b4ac-2295-4913-8aa5-f0fb158b7521" (UID: "6ff8b4ac-2295-4913-8aa5-f0fb158b7521"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.920977 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-scripts" (OuterVolumeSpecName: "scripts") pod "6ff8b4ac-2295-4913-8aa5-f0fb158b7521" (UID: "6ff8b4ac-2295-4913-8aa5-f0fb158b7521"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.932511 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-config-data" (OuterVolumeSpecName: "config-data") pod "6ff8b4ac-2295-4913-8aa5-f0fb158b7521" (UID: "6ff8b4ac-2295-4913-8aa5-f0fb158b7521"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.998351 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.998384 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgtmm\" (UniqueName: \"kubernetes.io/projected/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-kube-api-access-bgtmm\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.998394 4778 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:45 crc kubenswrapper[4778]: I0930 17:34:45.998403 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6ff8b4ac-2295-4913-8aa5-f0fb158b7521-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.317826 4778 generic.go:334] "Generic (PLEG): container finished" podID="6ff8b4ac-2295-4913-8aa5-f0fb158b7521" containerID="49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894" exitCode=137 Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.317873 4778 generic.go:334] "Generic (PLEG): container finished" podID="6ff8b4ac-2295-4913-8aa5-f0fb158b7521" containerID="57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39" exitCode=137 Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.317936 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7888848bff-9rdhr" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.317954 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7888848bff-9rdhr" event={"ID":"6ff8b4ac-2295-4913-8aa5-f0fb158b7521","Type":"ContainerDied","Data":"49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894"} Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.317996 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7888848bff-9rdhr" event={"ID":"6ff8b4ac-2295-4913-8aa5-f0fb158b7521","Type":"ContainerDied","Data":"57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39"} Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.318015 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7888848bff-9rdhr" event={"ID":"6ff8b4ac-2295-4913-8aa5-f0fb158b7521","Type":"ContainerDied","Data":"7fc8da9c7d3fda25aa37fce9a9a33d52c6abfd94c947ea1c9f6c41056457fbb1"} Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.318040 4778 scope.go:117] "RemoveContainer" containerID="49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.320456 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-nz4tj" event={"ID":"9fcfd6aa-7524-4fb2-8230-f35ed39691a9","Type":"ContainerDied","Data":"afbb9053f7242982fb338f8f243c8c411912ece34db279085f7a0b10d2d7179b"} Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.320487 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afbb9053f7242982fb338f8f243c8c411912ece34db279085f7a0b10d2d7179b" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.320549 4778 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-nz4tj" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.459925 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7888848bff-9rdhr"] Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.470497 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7888848bff-9rdhr"] Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.507155 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-wgw8f"] Sep 30 17:34:46 crc kubenswrapper[4778]: E0930 17:34:46.507463 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fcfd6aa-7524-4fb2-8230-f35ed39691a9" containerName="neutron-db-sync" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.507474 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fcfd6aa-7524-4fb2-8230-f35ed39691a9" containerName="neutron-db-sync" Sep 30 17:34:46 crc kubenswrapper[4778]: E0930 17:34:46.507493 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff8b4ac-2295-4913-8aa5-f0fb158b7521" containerName="horizon-log" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.507499 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff8b4ac-2295-4913-8aa5-f0fb158b7521" containerName="horizon-log" Sep 30 17:34:46 crc kubenswrapper[4778]: E0930 17:34:46.507524 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff8b4ac-2295-4913-8aa5-f0fb158b7521" containerName="horizon" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.507530 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff8b4ac-2295-4913-8aa5-f0fb158b7521" containerName="horizon" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.507725 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff8b4ac-2295-4913-8aa5-f0fb158b7521" containerName="horizon-log" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.507741 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fcfd6aa-7524-4fb2-8230-f35ed39691a9" containerName="neutron-db-sync" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.507748 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff8b4ac-2295-4913-8aa5-f0fb158b7521" containerName="horizon" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.508569 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.523444 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-wgw8f"] Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.572969 4778 scope.go:117] "RemoveContainer" containerID="57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.623583 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j48j5\" (UniqueName: \"kubernetes.io/projected/c316357a-faa3-4a48-9e86-0f9138b01f76-kube-api-access-j48j5\") pod \"dnsmasq-dns-7b946d459c-wgw8f\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.623778 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-wgw8f\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.623816 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-wgw8f\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.623896 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-config\") pod \"dnsmasq-dns-7b946d459c-wgw8f\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.626697 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-dns-svc\") pod \"dnsmasq-dns-7b946d459c-wgw8f\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.645974 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6dcd84db9b-hvcqd"] Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.647915 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.650208 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.651067 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-w4pf2" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.651322 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.653538 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.666688 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6dcd84db9b-hvcqd"] Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.737516 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j48j5\" (UniqueName: \"kubernetes.io/projected/c316357a-faa3-4a48-9e86-0f9138b01f76-kube-api-access-j48j5\") pod \"dnsmasq-dns-7b946d459c-wgw8f\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.737570 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-ovndb-tls-certs\") pod \"neutron-6dcd84db9b-hvcqd\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.737603 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-httpd-config\") pod \"neutron-6dcd84db9b-hvcqd\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.737649 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-wgw8f\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.737668 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5kvm\" (UniqueName: \"kubernetes.io/projected/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-kube-api-access-c5kvm\") pod \"neutron-6dcd84db9b-hvcqd\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.737690 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-wgw8f\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.737727 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-config\") pod \"dnsmasq-dns-7b946d459c-wgw8f\" 
(UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.737746 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-dns-svc\") pod \"dnsmasq-dns-7b946d459c-wgw8f\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.737762 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-combined-ca-bundle\") pod \"neutron-6dcd84db9b-hvcqd\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.737788 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-config\") pod \"neutron-6dcd84db9b-hvcqd\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.738883 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-wgw8f\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.738903 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-wgw8f\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.739090 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-config\") pod \"dnsmasq-dns-7b946d459c-wgw8f\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.739289 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-dns-svc\") pod \"dnsmasq-dns-7b946d459c-wgw8f\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.754132 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j48j5\" (UniqueName: \"kubernetes.io/projected/c316357a-faa3-4a48-9e86-0f9138b01f76-kube-api-access-j48j5\") pod \"dnsmasq-dns-7b946d459c-wgw8f\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.840866 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5kvm\" (UniqueName: \"kubernetes.io/projected/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-kube-api-access-c5kvm\") pod \"neutron-6dcd84db9b-hvcqd\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 
17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.841427 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-combined-ca-bundle\") pod \"neutron-6dcd84db9b-hvcqd\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.841469 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-config\") pod \"neutron-6dcd84db9b-hvcqd\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.841558 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-ovndb-tls-certs\") pod \"neutron-6dcd84db9b-hvcqd\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.841611 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-httpd-config\") pod \"neutron-6dcd84db9b-hvcqd\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.846750 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-combined-ca-bundle\") pod \"neutron-6dcd84db9b-hvcqd\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.847520 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-config\") pod \"neutron-6dcd84db9b-hvcqd\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.848898 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-ovndb-tls-certs\") pod \"neutron-6dcd84db9b-hvcqd\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.849515 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-httpd-config\") pod \"neutron-6dcd84db9b-hvcqd\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.858284 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.869041 4778 scope.go:117] "RemoveContainer" containerID="49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.871744 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5kvm\" (UniqueName: \"kubernetes.io/projected/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-kube-api-access-c5kvm\") pod \"neutron-6dcd84db9b-hvcqd\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:34:46 crc kubenswrapper[4778]: E0930 17:34:46.872985 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894\": container with ID starting with 49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894 not found: ID does not exist" containerID="49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.873023 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894"} err="failed to get container status \"49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894\": rpc error: code = NotFound desc = could not find container \"49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894\": container with ID starting with 49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894 not found: ID does not exist" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.873045 4778 scope.go:117] "RemoveContainer" containerID="57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39" Sep 30 17:34:46 crc kubenswrapper[4778]: E0930 17:34:46.880681 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39\": container with ID starting with 57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39 not found: ID does not exist" containerID="57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.880740 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39"} err="failed to get container status \"57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39\": rpc error: code = NotFound desc = could not find container \"57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39\": container with ID starting with 57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39 not found: ID does not exist" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.880770 4778 scope.go:117] "RemoveContainer" containerID="49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.881799 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894"} err="failed to get container status \"49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894\": rpc error: code = NotFound desc = could not find container 
\"49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894\": container with ID starting with 49b73b5a70aca8c5c5008f71f174a28f352908b694d029872ec80ecddb58b894 not found: ID does not exist" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.881823 4778 scope.go:117] "RemoveContainer" containerID="57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39" Sep 30 17:34:46 crc kubenswrapper[4778]: I0930 17:34:46.884501 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39"} err="failed to get container status \"57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39\": rpc error: code = NotFound desc = could not find container \"57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39\": container with ID starting with 57b228b7af7320d15037a46a78b0691dbd9f192499f35fa1f493b6e2d64bad39 not found: ID does not exist" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.019332 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.151215 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/65f393a8-cd69-4aff-b73f-96416be3f843-horizon-secret-key\") pod \"65f393a8-cd69-4aff-b73f-96416be3f843\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.151575 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65f393a8-cd69-4aff-b73f-96416be3f843-config-data\") pod \"65f393a8-cd69-4aff-b73f-96416be3f843\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.151691 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65f393a8-cd69-4aff-b73f-96416be3f843-scripts\") pod \"65f393a8-cd69-4aff-b73f-96416be3f843\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.151753 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65f393a8-cd69-4aff-b73f-96416be3f843-logs\") pod \"65f393a8-cd69-4aff-b73f-96416be3f843\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.151808 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5kzh\" (UniqueName: \"kubernetes.io/projected/65f393a8-cd69-4aff-b73f-96416be3f843-kube-api-access-d5kzh\") pod \"65f393a8-cd69-4aff-b73f-96416be3f843\" (UID: \"65f393a8-cd69-4aff-b73f-96416be3f843\") " Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.159922 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65f393a8-cd69-4aff-b73f-96416be3f843-logs" (OuterVolumeSpecName: "logs") pod "65f393a8-cd69-4aff-b73f-96416be3f843" (UID: "65f393a8-cd69-4aff-b73f-96416be3f843"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.164235 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f393a8-cd69-4aff-b73f-96416be3f843-kube-api-access-d5kzh" (OuterVolumeSpecName: "kube-api-access-d5kzh") pod "65f393a8-cd69-4aff-b73f-96416be3f843" (UID: "65f393a8-cd69-4aff-b73f-96416be3f843"). InnerVolumeSpecName "kube-api-access-d5kzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.165699 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.178006 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f393a8-cd69-4aff-b73f-96416be3f843-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "65f393a8-cd69-4aff-b73f-96416be3f843" (UID: "65f393a8-cd69-4aff-b73f-96416be3f843"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.220585 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65f393a8-cd69-4aff-b73f-96416be3f843-config-data" (OuterVolumeSpecName: "config-data") pod "65f393a8-cd69-4aff-b73f-96416be3f843" (UID: "65f393a8-cd69-4aff-b73f-96416be3f843"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.253351 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5kzh\" (UniqueName: \"kubernetes.io/projected/65f393a8-cd69-4aff-b73f-96416be3f843-kube-api-access-d5kzh\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.253381 4778 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/65f393a8-cd69-4aff-b73f-96416be3f843-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.253391 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65f393a8-cd69-4aff-b73f-96416be3f843-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.253400 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65f393a8-cd69-4aff-b73f-96416be3f843-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.295277 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65f393a8-cd69-4aff-b73f-96416be3f843-scripts" (OuterVolumeSpecName: "scripts") pod "65f393a8-cd69-4aff-b73f-96416be3f843" (UID: "65f393a8-cd69-4aff-b73f-96416be3f843"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.308570 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.352220 4778 generic.go:334] "Generic (PLEG): container finished" podID="3e093cbb-cd23-4a15-8159-fa514c992d60" containerID="b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba" exitCode=137 Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.352253 4778 generic.go:334] "Generic (PLEG): container finished" podID="3e093cbb-cd23-4a15-8159-fa514c992d60" containerID="f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4" exitCode=137 Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.352373 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.353119 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cdc899d97-c5xnr" event={"ID":"3e093cbb-cd23-4a15-8159-fa514c992d60","Type":"ContainerDied","Data":"b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba"} Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.353146 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cdc899d97-c5xnr" event={"ID":"3e093cbb-cd23-4a15-8159-fa514c992d60","Type":"ContainerDied","Data":"f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4"} Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.353156 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5cdc899d97-c5xnr" event={"ID":"3e093cbb-cd23-4a15-8159-fa514c992d60","Type":"ContainerDied","Data":"b2323e3fd40f80dd2ac26428faf5687504c197caa5027b511acfa813a96f2ef6"} Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.353170 4778 scope.go:117] "RemoveContainer" containerID="b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.354469 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e093cbb-cd23-4a15-8159-fa514c992d60-config-data\") pod \"3e093cbb-cd23-4a15-8159-fa514c992d60\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.354590 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e093cbb-cd23-4a15-8159-fa514c992d60-horizon-secret-key\") pod \"3e093cbb-cd23-4a15-8159-fa514c992d60\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.354647 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lr5dh\" (UniqueName: \"kubernetes.io/projected/3e093cbb-cd23-4a15-8159-fa514c992d60-kube-api-access-lr5dh\") pod \"3e093cbb-cd23-4a15-8159-fa514c992d60\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.354761 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e093cbb-cd23-4a15-8159-fa514c992d60-logs\") pod \"3e093cbb-cd23-4a15-8159-fa514c992d60\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.354803 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e093cbb-cd23-4a15-8159-fa514c992d60-scripts\") pod 
\"3e093cbb-cd23-4a15-8159-fa514c992d60\" (UID: \"3e093cbb-cd23-4a15-8159-fa514c992d60\") " Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.355121 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65f393a8-cd69-4aff-b73f-96416be3f843-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.356906 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e093cbb-cd23-4a15-8159-fa514c992d60-logs" (OuterVolumeSpecName: "logs") pod "3e093cbb-cd23-4a15-8159-fa514c992d60" (UID: "3e093cbb-cd23-4a15-8159-fa514c992d60"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.359772 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e093cbb-cd23-4a15-8159-fa514c992d60-kube-api-access-lr5dh" (OuterVolumeSpecName: "kube-api-access-lr5dh") pod "3e093cbb-cd23-4a15-8159-fa514c992d60" (UID: "3e093cbb-cd23-4a15-8159-fa514c992d60"). InnerVolumeSpecName "kube-api-access-lr5dh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.362017 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e093cbb-cd23-4a15-8159-fa514c992d60-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "3e093cbb-cd23-4a15-8159-fa514c992d60" (UID: "3e093cbb-cd23-4a15-8159-fa514c992d60"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.376004 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e093cbb-cd23-4a15-8159-fa514c992d60-scripts" (OuterVolumeSpecName: "scripts") pod "3e093cbb-cd23-4a15-8159-fa514c992d60" (UID: "3e093cbb-cd23-4a15-8159-fa514c992d60"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.384839 4778 generic.go:334] "Generic (PLEG): container finished" podID="65f393a8-cd69-4aff-b73f-96416be3f843" containerID="83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf" exitCode=137 Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.384872 4778 generic.go:334] "Generic (PLEG): container finished" podID="65f393a8-cd69-4aff-b73f-96416be3f843" containerID="2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3" exitCode=137 Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.384896 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67bfb67f5f-wzjct" event={"ID":"65f393a8-cd69-4aff-b73f-96416be3f843","Type":"ContainerDied","Data":"83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf"} Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.384925 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67bfb67f5f-wzjct" event={"ID":"65f393a8-cd69-4aff-b73f-96416be3f843","Type":"ContainerDied","Data":"2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3"} Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.384938 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67bfb67f5f-wzjct" event={"ID":"65f393a8-cd69-4aff-b73f-96416be3f843","Type":"ContainerDied","Data":"5c7531fe62eb0d906f02a24c8709aa176577774b3fc0ad7e81d00cf31399316d"} Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.385002 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67bfb67f5f-wzjct" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.392357 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e093cbb-cd23-4a15-8159-fa514c992d60-config-data" (OuterVolumeSpecName: "config-data") pod "3e093cbb-cd23-4a15-8159-fa514c992d60" (UID: "3e093cbb-cd23-4a15-8159-fa514c992d60"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.456092 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3e093cbb-cd23-4a15-8159-fa514c992d60-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.456120 4778 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/3e093cbb-cd23-4a15-8159-fa514c992d60-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.456132 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lr5dh\" (UniqueName: \"kubernetes.io/projected/3e093cbb-cd23-4a15-8159-fa514c992d60-kube-api-access-lr5dh\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.456141 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3e093cbb-cd23-4a15-8159-fa514c992d60-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.456149 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e093cbb-cd23-4a15-8159-fa514c992d60-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.457075 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67bfb67f5f-wzjct"] Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.469041 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-67bfb67f5f-wzjct"] Sep 30 17:34:47 crc kubenswrapper[4778]: E0930 17:34:47.508724 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65f393a8_cd69_4aff_b73f_96416be3f843.slice\": RecentStats: unable to find data in memory cache]" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.573013 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-wgw8f"] Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.616813 4778 scope.go:117] "RemoveContainer" containerID="f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.724892 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65f393a8-cd69-4aff-b73f-96416be3f843" path="/var/lib/kubelet/pods/65f393a8-cd69-4aff-b73f-96416be3f843/volumes" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.725822 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ff8b4ac-2295-4913-8aa5-f0fb158b7521" path="/var/lib/kubelet/pods/6ff8b4ac-2295-4913-8aa5-f0fb158b7521/volumes" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.738647 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-f9d6b5dcb-zrl5s" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.763761 4778 scope.go:117] "RemoveContainer" containerID="b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba" Sep 30 17:34:47 crc kubenswrapper[4778]: E0930 17:34:47.765087 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba\": container with ID starting with 
b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba not found: ID does not exist" containerID="b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.765120 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba"} err="failed to get container status \"b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba\": rpc error: code = NotFound desc = could not find container \"b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba\": container with ID starting with b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba not found: ID does not exist" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.765152 4778 scope.go:117] "RemoveContainer" containerID="f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4" Sep 30 17:34:47 crc kubenswrapper[4778]: E0930 17:34:47.766353 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4\": container with ID starting with f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4 not found: ID does not exist" containerID="f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.766386 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4"} err="failed to get container status \"f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4\": rpc error: code = NotFound desc = could not find container \"f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4\": container with ID starting with f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4 not found: ID does not exist" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.766406 4778 scope.go:117] "RemoveContainer" containerID="b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.766727 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba"} err="failed to get container status \"b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba\": rpc error: code = NotFound desc = could not find container \"b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba\": container with ID starting with b3f1646749b5808d8482310fa139516a7b146453989095663ffefbd39ff8cfba not found: ID does not exist" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.766755 4778 scope.go:117] "RemoveContainer" containerID="f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.767013 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4"} err="failed to get container status \"f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4\": rpc error: code = NotFound desc = could not find container \"f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4\": container with ID starting with f7aed90d2edc48b3193b975e001132568f96928d86e6153eb27ef8155cb0f0c4 not found: ID does not exist" Sep 30 
17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.767037 4778 scope.go:117] "RemoveContainer" containerID="83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.867787 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6dcd84db9b-hvcqd"] Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.971320 4778 scope.go:117] "RemoveContainer" containerID="2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3" Sep 30 17:34:47 crc kubenswrapper[4778]: W0930 17:34:47.980411 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaa8a326_cf9a_4788_9e58_7c8bfbe92962.slice/crio-4c7608b9ac4ec80e9c038e49f44d3ecb2a0492c4d771d350537eab7565f63b78 WatchSource:0}: Error finding container 4c7608b9ac4ec80e9c038e49f44d3ecb2a0492c4d771d350537eab7565f63b78: Status 404 returned error can't find the container with id 4c7608b9ac4ec80e9c038e49f44d3ecb2a0492c4d771d350537eab7565f63b78 Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.988663 4778 scope.go:117] "RemoveContainer" containerID="83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf" Sep 30 17:34:47 crc kubenswrapper[4778]: E0930 17:34:47.989199 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf\": container with ID starting with 83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf not found: ID does not exist" containerID="83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.989309 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf"} err="failed to get container status \"83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf\": rpc error: code = NotFound desc = could not find container \"83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf\": container with ID starting with 83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf not found: ID does not exist" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.989404 4778 scope.go:117] "RemoveContainer" containerID="2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3" Sep 30 17:34:47 crc kubenswrapper[4778]: E0930 17:34:47.990639 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3\": container with ID starting with 2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3 not found: ID does not exist" containerID="2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.990756 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3"} err="failed to get container status \"2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3\": rpc error: code = NotFound desc = could not find container \"2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3\": container with ID starting with 2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3 not found: ID does not exist" Sep 30 17:34:47 crc 
kubenswrapper[4778]: I0930 17:34:47.990861 4778 scope.go:117] "RemoveContainer" containerID="83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.991230 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf"} err="failed to get container status \"83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf\": rpc error: code = NotFound desc = could not find container \"83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf\": container with ID starting with 83efc49054ecafaa815e24ea3203ee713cdd182a4228c776548dfc895dba08bf not found: ID does not exist" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.991326 4778 scope.go:117] "RemoveContainer" containerID="2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3" Sep 30 17:34:47 crc kubenswrapper[4778]: I0930 17:34:47.991724 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3"} err="failed to get container status \"2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3\": rpc error: code = NotFound desc = could not find container \"2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3\": container with ID starting with 2e4e0363e3866fab653c5416c205e23d10381e5617948812f57c56485adc91a3 not found: ID does not exist" Sep 30 17:34:48 crc kubenswrapper[4778]: I0930 17:34:48.046314 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:34:48 crc kubenswrapper[4778]: I0930 17:34:48.394572 4778 generic.go:334] "Generic (PLEG): container finished" podID="c316357a-faa3-4a48-9e86-0f9138b01f76" containerID="c10362c784a5abc55288b5514b67d8c30d1ac8317acb8b3c292ba47bff60203a" exitCode=0 Sep 30 17:34:48 crc kubenswrapper[4778]: I0930 17:34:48.394657 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" event={"ID":"c316357a-faa3-4a48-9e86-0f9138b01f76","Type":"ContainerDied","Data":"c10362c784a5abc55288b5514b67d8c30d1ac8317acb8b3c292ba47bff60203a"} Sep 30 17:34:48 crc kubenswrapper[4778]: I0930 17:34:48.394891 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" event={"ID":"c316357a-faa3-4a48-9e86-0f9138b01f76","Type":"ContainerStarted","Data":"bf4f74b6a156a9834ea41fcb8ec84b7b41e0e34e1d8a8fc7b131c582254c3f33"} Sep 30 17:34:48 crc kubenswrapper[4778]: I0930 17:34:48.401894 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dcd84db9b-hvcqd" event={"ID":"aaa8a326-cf9a-4788-9e58-7c8bfbe92962","Type":"ContainerStarted","Data":"7e4454346fd4b2af1658fd800cf2d4d1bd2bf7f514806f9c012b9b4014efe4d7"} Sep 30 17:34:48 crc kubenswrapper[4778]: I0930 17:34:48.401942 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dcd84db9b-hvcqd" event={"ID":"aaa8a326-cf9a-4788-9e58-7c8bfbe92962","Type":"ContainerStarted","Data":"fa3179aa1770f2ea120d9f140816cf257ced5072452c736c4961c586d309688c"} Sep 30 17:34:48 crc kubenswrapper[4778]: I0930 17:34:48.401954 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dcd84db9b-hvcqd" event={"ID":"aaa8a326-cf9a-4788-9e58-7c8bfbe92962","Type":"ContainerStarted","Data":"4c7608b9ac4ec80e9c038e49f44d3ecb2a0492c4d771d350537eab7565f63b78"} Sep 30 17:34:48 crc 
kubenswrapper[4778]: I0930 17:34:48.402062 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:34:48 crc kubenswrapper[4778]: I0930 17:34:48.438056 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6dcd84db9b-hvcqd" podStartSLOduration=2.4380421820000002 podStartE2EDuration="2.438042182s" podCreationTimestamp="2025-09-30 17:34:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:34:48.436702441 +0000 UTC m=+1027.426600294" watchObservedRunningTime="2025-09-30 17:34:48.438042182 +0000 UTC m=+1027.427939985" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.410149 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" event={"ID":"c316357a-faa3-4a48-9e86-0f9138b01f76","Type":"ContainerStarted","Data":"c58eeacc1efa12e3f03e7c69fc7e4db1e7e76bfd5ab48878df0e3d0064a25639"} Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.432510 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" podStartSLOduration=3.432486509 podStartE2EDuration="3.432486509s" podCreationTimestamp="2025-09-30 17:34:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:34:49.427981741 +0000 UTC m=+1028.417879564" watchObservedRunningTime="2025-09-30 17:34:49.432486509 +0000 UTC m=+1028.422384322" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.602452 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-f9d6b5dcb-zrl5s" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.661774 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-98b979bf7-grpws"] Sep 30 17:34:49 crc kubenswrapper[4778]: E0930 17:34:49.662144 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e093cbb-cd23-4a15-8159-fa514c992d60" containerName="horizon-log" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.662160 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e093cbb-cd23-4a15-8159-fa514c992d60" containerName="horizon-log" Sep 30 17:34:49 crc kubenswrapper[4778]: E0930 17:34:49.662171 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e093cbb-cd23-4a15-8159-fa514c992d60" containerName="horizon" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.662177 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e093cbb-cd23-4a15-8159-fa514c992d60" containerName="horizon" Sep 30 17:34:49 crc kubenswrapper[4778]: E0930 17:34:49.662189 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f393a8-cd69-4aff-b73f-96416be3f843" containerName="horizon" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.662195 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f393a8-cd69-4aff-b73f-96416be3f843" containerName="horizon" Sep 30 17:34:49 crc kubenswrapper[4778]: E0930 17:34:49.662216 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f393a8-cd69-4aff-b73f-96416be3f843" containerName="horizon-log" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.662222 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f393a8-cd69-4aff-b73f-96416be3f843" containerName="horizon-log" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.662365 4778 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3e093cbb-cd23-4a15-8159-fa514c992d60" containerName="horizon-log" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.662379 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e093cbb-cd23-4a15-8159-fa514c992d60" containerName="horizon" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.662391 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f393a8-cd69-4aff-b73f-96416be3f843" containerName="horizon-log" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.662411 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f393a8-cd69-4aff-b73f-96416be3f843" containerName="horizon" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.663255 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.672508 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.672527 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.691959 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-98b979bf7-grpws"] Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.701701 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-httpd-config\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.701807 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-ovndb-tls-certs\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.701846 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-combined-ca-bundle\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.701886 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-internal-tls-certs\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.701931 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cks7\" (UniqueName: \"kubernetes.io/projected/613850bf-ac7d-4c00-bc60-873582d3d45e-kube-api-access-4cks7\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.702041 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-public-tls-certs\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.702101 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-config\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.714676 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67d9465564-vcjl8"] Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.715004 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67d9465564-vcjl8" podUID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" containerName="horizon-log" containerID="cri-o://f46ea7bb14934f472ac14ea511b321663dabc3b03e06c1ab29ba50f82a78b301" gracePeriod=30 Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.715475 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67d9465564-vcjl8" podUID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" containerName="horizon" containerID="cri-o://e621921148275d7f14a235d97f55f5194ed9f470a27069ec84c67cab8d3f9b84" gracePeriod=30 Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.724894 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67d9465564-vcjl8" podUID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.134:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.733411 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67d9465564-vcjl8" podUID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.134:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:35712->10.217.0.134:8443: read: connection reset by peer" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.803136 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-internal-tls-certs\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.803179 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cks7\" (UniqueName: \"kubernetes.io/projected/613850bf-ac7d-4c00-bc60-873582d3d45e-kube-api-access-4cks7\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.803248 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-public-tls-certs\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.803278 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-config\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.803331 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-httpd-config\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.803371 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-ovndb-tls-certs\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.803397 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-combined-ca-bundle\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.810322 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-httpd-config\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.811191 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-public-tls-certs\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.812299 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-combined-ca-bundle\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.818400 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-ovndb-tls-certs\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.823122 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-config\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.823385 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/613850bf-ac7d-4c00-bc60-873582d3d45e-internal-tls-certs\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " 
pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:49 crc kubenswrapper[4778]: I0930 17:34:49.825927 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cks7\" (UniqueName: \"kubernetes.io/projected/613850bf-ac7d-4c00-bc60-873582d3d45e-kube-api-access-4cks7\") pod \"neutron-98b979bf7-grpws\" (UID: \"613850bf-ac7d-4c00-bc60-873582d3d45e\") " pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:50 crc kubenswrapper[4778]: I0930 17:34:50.003027 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:50 crc kubenswrapper[4778]: I0930 17:34:50.416276 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:50 crc kubenswrapper[4778]: I0930 17:34:50.647401 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-98b979bf7-grpws"] Sep 30 17:34:50 crc kubenswrapper[4778]: W0930 17:34:50.656834 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod613850bf_ac7d_4c00_bc60_873582d3d45e.slice/crio-4980845e1ad07d932f1059ce44e00a0e946cbaf285d99780ad5947b4cf4802eb WatchSource:0}: Error finding container 4980845e1ad07d932f1059ce44e00a0e946cbaf285d99780ad5947b4cf4802eb: Status 404 returned error can't find the container with id 4980845e1ad07d932f1059ce44e00a0e946cbaf285d99780ad5947b4cf4802eb Sep 30 17:34:51 crc kubenswrapper[4778]: I0930 17:34:51.427988 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98b979bf7-grpws" event={"ID":"613850bf-ac7d-4c00-bc60-873582d3d45e","Type":"ContainerStarted","Data":"b417d86fa7d1a56d3e3478a530e5acc30d2c5fe867601c6c3b1fa6bdd0d74dc2"} Sep 30 17:34:51 crc kubenswrapper[4778]: I0930 17:34:51.428539 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98b979bf7-grpws" event={"ID":"613850bf-ac7d-4c00-bc60-873582d3d45e","Type":"ContainerStarted","Data":"69e4d7e962a60f0564dcc51501564dcf3d7a6422f170dba5fa3b24a545085533"} Sep 30 17:34:51 crc kubenswrapper[4778]: I0930 17:34:51.428559 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-98b979bf7-grpws" event={"ID":"613850bf-ac7d-4c00-bc60-873582d3d45e","Type":"ContainerStarted","Data":"4980845e1ad07d932f1059ce44e00a0e946cbaf285d99780ad5947b4cf4802eb"} Sep 30 17:34:51 crc kubenswrapper[4778]: I0930 17:34:51.456400 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-98b979bf7-grpws" podStartSLOduration=2.456385108 podStartE2EDuration="2.456385108s" podCreationTimestamp="2025-09-30 17:34:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:34:51.453533032 +0000 UTC m=+1030.443430835" watchObservedRunningTime="2025-09-30 17:34:51.456385108 +0000 UTC m=+1030.446282911" Sep 30 17:34:51 crc kubenswrapper[4778]: I0930 17:34:51.735229 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 17:34:51 crc kubenswrapper[4778]: I0930 17:34:51.735297 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 17:34:51 crc kubenswrapper[4778]: I0930 17:34:51.780958 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 17:34:51 crc 
kubenswrapper[4778]: I0930 17:34:51.809403 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 17:34:51 crc kubenswrapper[4778]: I0930 17:34:51.977147 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 17:34:51 crc kubenswrapper[4778]: I0930 17:34:51.977200 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 17:34:52 crc kubenswrapper[4778]: I0930 17:34:52.018088 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 17:34:52 crc kubenswrapper[4778]: I0930 17:34:52.032036 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 17:34:52 crc kubenswrapper[4778]: I0930 17:34:52.442121 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:34:52 crc kubenswrapper[4778]: I0930 17:34:52.442165 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 17:34:52 crc kubenswrapper[4778]: I0930 17:34:52.442651 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 17:34:52 crc kubenswrapper[4778]: I0930 17:34:52.442825 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 17:34:52 crc kubenswrapper[4778]: I0930 17:34:52.442846 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 17:34:53 crc kubenswrapper[4778]: I0930 17:34:53.451461 4778 generic.go:334] "Generic (PLEG): container finished" podID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" containerID="e621921148275d7f14a235d97f55f5194ed9f470a27069ec84c67cab8d3f9b84" exitCode=0 Sep 30 17:34:53 crc kubenswrapper[4778]: I0930 17:34:53.451532 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67d9465564-vcjl8" event={"ID":"b48eff09-cd27-48b0-9633-0fd28d43e0a0","Type":"ContainerDied","Data":"e621921148275d7f14a235d97f55f5194ed9f470a27069ec84c67cab8d3f9b84"} Sep 30 17:34:54 crc kubenswrapper[4778]: I0930 17:34:54.459672 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 17:34:54 crc kubenswrapper[4778]: I0930 17:34:54.459707 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 17:34:54 crc kubenswrapper[4778]: I0930 17:34:54.459687 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 17:34:54 crc kubenswrapper[4778]: I0930 17:34:54.459811 4778 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 17:34:54 crc kubenswrapper[4778]: I0930 17:34:54.467114 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:54 crc kubenswrapper[4778]: I0930 17:34:54.469843 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-768b5657b-6fmpz" Sep 30 17:34:54 crc kubenswrapper[4778]: I0930 17:34:54.770341 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 17:34:54 crc kubenswrapper[4778]: I0930 17:34:54.775439 4778 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 17:34:54 crc kubenswrapper[4778]: I0930 17:34:54.892674 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 17:34:54 crc kubenswrapper[4778]: I0930 17:34:54.894067 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 17:34:55 crc kubenswrapper[4778]: I0930 17:34:55.470483 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fsrx9" event={"ID":"3ef0e458-8d92-4b76-8da4-24357b6911dc","Type":"ContainerStarted","Data":"d09d4fe9810398410fd612bca02872713eef7cd44997bab9fe75a6bd52e935db"} Sep 30 17:34:55 crc kubenswrapper[4778]: I0930 17:34:55.492796 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-fsrx9" podStartSLOduration=6.51451816 podStartE2EDuration="45.492773951s" podCreationTimestamp="2025-09-30 17:34:10 +0000 UTC" firstStartedPulling="2025-09-30 17:34:15.499893628 +0000 UTC m=+994.489791431" lastFinishedPulling="2025-09-30 17:34:54.478149419 +0000 UTC m=+1033.468047222" observedRunningTime="2025-09-30 17:34:55.483376634 +0000 UTC m=+1034.473274457" watchObservedRunningTime="2025-09-30 17:34:55.492773951 +0000 UTC m=+1034.482671754" Sep 30 17:34:55 crc kubenswrapper[4778]: I0930 17:34:55.533448 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67d9465564-vcjl8" podUID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.134:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.134:8443: connect: connection refused" Sep 30 17:34:56 crc kubenswrapper[4778]: I0930 17:34:56.859810 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:34:56 crc kubenswrapper[4778]: I0930 17:34:56.935338 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-zhq6z"] Sep 30 17:34:56 crc kubenswrapper[4778]: I0930 17:34:56.935871 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" podUID="1fb952cc-e13f-4021-b29e-0d37280a67bc" containerName="dnsmasq-dns" containerID="cri-o://ef37ba9fc55a1d367409150e0e8aa8b0c234947138c95593ea75f3cc7f40e9db" gracePeriod=10 Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.454466 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.501922 4778 generic.go:334] "Generic (PLEG): container finished" podID="1fb952cc-e13f-4021-b29e-0d37280a67bc" containerID="ef37ba9fc55a1d367409150e0e8aa8b0c234947138c95593ea75f3cc7f40e9db" exitCode=0 Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.502109 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" event={"ID":"1fb952cc-e13f-4021-b29e-0d37280a67bc","Type":"ContainerDied","Data":"ef37ba9fc55a1d367409150e0e8aa8b0c234947138c95593ea75f3cc7f40e9db"} Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.502184 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" event={"ID":"1fb952cc-e13f-4021-b29e-0d37280a67bc","Type":"ContainerDied","Data":"c3e5b812141e1581c21543059dea70439a6d08eca2a415c91e5d45835b86d757"} Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.502215 4778 scope.go:117] "RemoveContainer" containerID="ef37ba9fc55a1d367409150e0e8aa8b0c234947138c95593ea75f3cc7f40e9db" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.502259 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-zhq6z" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.532868 4778 scope.go:117] "RemoveContainer" containerID="63b0c74fd0afda931a8c88bc3faaf31ae49b572fdd32f7aba1cbf3e06a492496" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.559182 4778 scope.go:117] "RemoveContainer" containerID="ef37ba9fc55a1d367409150e0e8aa8b0c234947138c95593ea75f3cc7f40e9db" Sep 30 17:34:57 crc kubenswrapper[4778]: E0930 17:34:57.559493 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef37ba9fc55a1d367409150e0e8aa8b0c234947138c95593ea75f3cc7f40e9db\": container with ID starting with ef37ba9fc55a1d367409150e0e8aa8b0c234947138c95593ea75f3cc7f40e9db not found: ID does not exist" containerID="ef37ba9fc55a1d367409150e0e8aa8b0c234947138c95593ea75f3cc7f40e9db" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.559516 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef37ba9fc55a1d367409150e0e8aa8b0c234947138c95593ea75f3cc7f40e9db"} err="failed to get container status \"ef37ba9fc55a1d367409150e0e8aa8b0c234947138c95593ea75f3cc7f40e9db\": rpc error: code = NotFound desc = could not find container \"ef37ba9fc55a1d367409150e0e8aa8b0c234947138c95593ea75f3cc7f40e9db\": container with ID starting with ef37ba9fc55a1d367409150e0e8aa8b0c234947138c95593ea75f3cc7f40e9db not found: ID does not exist" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.559538 4778 scope.go:117] "RemoveContainer" containerID="63b0c74fd0afda931a8c88bc3faaf31ae49b572fdd32f7aba1cbf3e06a492496" Sep 30 17:34:57 crc kubenswrapper[4778]: E0930 17:34:57.559733 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63b0c74fd0afda931a8c88bc3faaf31ae49b572fdd32f7aba1cbf3e06a492496\": container with ID starting with 63b0c74fd0afda931a8c88bc3faaf31ae49b572fdd32f7aba1cbf3e06a492496 not found: ID does not exist" containerID="63b0c74fd0afda931a8c88bc3faaf31ae49b572fdd32f7aba1cbf3e06a492496" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.559751 4778 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"63b0c74fd0afda931a8c88bc3faaf31ae49b572fdd32f7aba1cbf3e06a492496"} err="failed to get container status \"63b0c74fd0afda931a8c88bc3faaf31ae49b572fdd32f7aba1cbf3e06a492496\": rpc error: code = NotFound desc = could not find container \"63b0c74fd0afda931a8c88bc3faaf31ae49b572fdd32f7aba1cbf3e06a492496\": container with ID starting with 63b0c74fd0afda931a8c88bc3faaf31ae49b572fdd32f7aba1cbf3e06a492496 not found: ID does not exist" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.588579 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-config\") pod \"1fb952cc-e13f-4021-b29e-0d37280a67bc\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.588737 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-dns-svc\") pod \"1fb952cc-e13f-4021-b29e-0d37280a67bc\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.588785 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq48r\" (UniqueName: \"kubernetes.io/projected/1fb952cc-e13f-4021-b29e-0d37280a67bc-kube-api-access-tq48r\") pod \"1fb952cc-e13f-4021-b29e-0d37280a67bc\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.588806 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-ovsdbserver-nb\") pod \"1fb952cc-e13f-4021-b29e-0d37280a67bc\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.588837 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-ovsdbserver-sb\") pod \"1fb952cc-e13f-4021-b29e-0d37280a67bc\" (UID: \"1fb952cc-e13f-4021-b29e-0d37280a67bc\") " Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.593840 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fb952cc-e13f-4021-b29e-0d37280a67bc-kube-api-access-tq48r" (OuterVolumeSpecName: "kube-api-access-tq48r") pod "1fb952cc-e13f-4021-b29e-0d37280a67bc" (UID: "1fb952cc-e13f-4021-b29e-0d37280a67bc"). InnerVolumeSpecName "kube-api-access-tq48r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.634679 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1fb952cc-e13f-4021-b29e-0d37280a67bc" (UID: "1fb952cc-e13f-4021-b29e-0d37280a67bc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.641792 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1fb952cc-e13f-4021-b29e-0d37280a67bc" (UID: "1fb952cc-e13f-4021-b29e-0d37280a67bc"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.644394 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-config" (OuterVolumeSpecName: "config") pod "1fb952cc-e13f-4021-b29e-0d37280a67bc" (UID: "1fb952cc-e13f-4021-b29e-0d37280a67bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.645669 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1fb952cc-e13f-4021-b29e-0d37280a67bc" (UID: "1fb952cc-e13f-4021-b29e-0d37280a67bc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.690901 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.690942 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq48r\" (UniqueName: \"kubernetes.io/projected/1fb952cc-e13f-4021-b29e-0d37280a67bc-kube-api-access-tq48r\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.690963 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.690975 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.690986 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fb952cc-e13f-4021-b29e-0d37280a67bc-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.824647 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-zhq6z"] Sep 30 17:34:57 crc kubenswrapper[4778]: I0930 17:34:57.830089 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-zhq6z"] Sep 30 17:34:59 crc kubenswrapper[4778]: I0930 17:34:59.535809 4778 generic.go:334] "Generic (PLEG): container finished" podID="3ef0e458-8d92-4b76-8da4-24357b6911dc" containerID="d09d4fe9810398410fd612bca02872713eef7cd44997bab9fe75a6bd52e935db" exitCode=0 Sep 30 17:34:59 crc kubenswrapper[4778]: I0930 17:34:59.535904 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fsrx9" event={"ID":"3ef0e458-8d92-4b76-8da4-24357b6911dc","Type":"ContainerDied","Data":"d09d4fe9810398410fd612bca02872713eef7cd44997bab9fe75a6bd52e935db"} Sep 30 17:34:59 crc kubenswrapper[4778]: I0930 17:34:59.731750 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fb952cc-e13f-4021-b29e-0d37280a67bc" path="/var/lib/kubelet/pods/1fb952cc-e13f-4021-b29e-0d37280a67bc/volumes" Sep 30 17:35:00 crc kubenswrapper[4778]: I0930 17:35:00.920290 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fsrx9" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.061003 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-config-data\") pod \"3ef0e458-8d92-4b76-8da4-24357b6911dc\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.061087 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm9n6\" (UniqueName: \"kubernetes.io/projected/3ef0e458-8d92-4b76-8da4-24357b6911dc-kube-api-access-mm9n6\") pod \"3ef0e458-8d92-4b76-8da4-24357b6911dc\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.061161 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ef0e458-8d92-4b76-8da4-24357b6911dc-etc-machine-id\") pod \"3ef0e458-8d92-4b76-8da4-24357b6911dc\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.061228 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-db-sync-config-data\") pod \"3ef0e458-8d92-4b76-8da4-24357b6911dc\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.061290 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ef0e458-8d92-4b76-8da4-24357b6911dc-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3ef0e458-8d92-4b76-8da4-24357b6911dc" (UID: "3ef0e458-8d92-4b76-8da4-24357b6911dc"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.061341 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-scripts\") pod \"3ef0e458-8d92-4b76-8da4-24357b6911dc\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.061378 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-combined-ca-bundle\") pod \"3ef0e458-8d92-4b76-8da4-24357b6911dc\" (UID: \"3ef0e458-8d92-4b76-8da4-24357b6911dc\") " Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.062648 4778 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ef0e458-8d92-4b76-8da4-24357b6911dc-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.072736 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ef0e458-8d92-4b76-8da4-24357b6911dc-kube-api-access-mm9n6" (OuterVolumeSpecName: "kube-api-access-mm9n6") pod "3ef0e458-8d92-4b76-8da4-24357b6911dc" (UID: "3ef0e458-8d92-4b76-8da4-24357b6911dc"). InnerVolumeSpecName "kube-api-access-mm9n6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.073435 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3ef0e458-8d92-4b76-8da4-24357b6911dc" (UID: "3ef0e458-8d92-4b76-8da4-24357b6911dc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.076729 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-scripts" (OuterVolumeSpecName: "scripts") pod "3ef0e458-8d92-4b76-8da4-24357b6911dc" (UID: "3ef0e458-8d92-4b76-8da4-24357b6911dc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.107276 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3ef0e458-8d92-4b76-8da4-24357b6911dc" (UID: "3ef0e458-8d92-4b76-8da4-24357b6911dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.141758 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-config-data" (OuterVolumeSpecName: "config-data") pod "3ef0e458-8d92-4b76-8da4-24357b6911dc" (UID: "3ef0e458-8d92-4b76-8da4-24357b6911dc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.164106 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.164194 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm9n6\" (UniqueName: \"kubernetes.io/projected/3ef0e458-8d92-4b76-8da4-24357b6911dc-kube-api-access-mm9n6\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.164217 4778 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.164235 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.164253 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ef0e458-8d92-4b76-8da4-24357b6911dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.563109 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fsrx9" event={"ID":"3ef0e458-8d92-4b76-8da4-24357b6911dc","Type":"ContainerDied","Data":"c479550acad54299941fa0cd3359e34fc4df9ec7eb3d3c0a3f22eef9c74b56f9"} Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.563173 4778 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="c479550acad54299941fa0cd3359e34fc4df9ec7eb3d3c0a3f22eef9c74b56f9" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.563209 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fsrx9" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.952603 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:35:01 crc kubenswrapper[4778]: E0930 17:35:01.953330 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef0e458-8d92-4b76-8da4-24357b6911dc" containerName="cinder-db-sync" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.953349 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef0e458-8d92-4b76-8da4-24357b6911dc" containerName="cinder-db-sync" Sep 30 17:35:01 crc kubenswrapper[4778]: E0930 17:35:01.953370 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb952cc-e13f-4021-b29e-0d37280a67bc" containerName="dnsmasq-dns" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.953378 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb952cc-e13f-4021-b29e-0d37280a67bc" containerName="dnsmasq-dns" Sep 30 17:35:01 crc kubenswrapper[4778]: E0930 17:35:01.953397 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fb952cc-e13f-4021-b29e-0d37280a67bc" containerName="init" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.953404 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fb952cc-e13f-4021-b29e-0d37280a67bc" containerName="init" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.953606 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef0e458-8d92-4b76-8da4-24357b6911dc" containerName="cinder-db-sync" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.953643 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fb952cc-e13f-4021-b29e-0d37280a67bc" containerName="dnsmasq-dns" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.956923 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.958733 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.958895 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-mc96g" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.959226 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.959486 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 17:35:01 crc kubenswrapper[4778]: I0930 17:35:01.969740 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.016466 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f64d5748f-f9s5c"] Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.022646 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.031296 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f64d5748f-f9s5c"] Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.082207 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-config-data\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.082342 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.082625 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.082687 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-scripts\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.082723 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d7c8212-9e9e-4efb-b071-506f0dd66b60-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.082744 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq9r9\" (UniqueName: \"kubernetes.io/projected/6d7c8212-9e9e-4efb-b071-506f0dd66b60-kube-api-access-tq9r9\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.129803 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.133126 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.136698 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.147347 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.184653 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-scripts\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.184697 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-ovsdbserver-nb\") pod \"dnsmasq-dns-f64d5748f-f9s5c\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.184722 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d7c8212-9e9e-4efb-b071-506f0dd66b60-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.184745 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq9r9\" (UniqueName: \"kubernetes.io/projected/6d7c8212-9e9e-4efb-b071-506f0dd66b60-kube-api-access-tq9r9\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.184762 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-config\") pod \"dnsmasq-dns-f64d5748f-f9s5c\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.184802 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-config-data\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.184837 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d7c8212-9e9e-4efb-b071-506f0dd66b60-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.184846 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ln5p\" (UniqueName: \"kubernetes.io/projected/06aa7d53-c968-419c-9782-f89c704e7ebe-kube-api-access-4ln5p\") pod \"dnsmasq-dns-f64d5748f-f9s5c\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.184929 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.184991 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-ovsdbserver-sb\") pod \"dnsmasq-dns-f64d5748f-f9s5c\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.186212 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-dns-svc\") pod \"dnsmasq-dns-f64d5748f-f9s5c\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.186256 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.190973 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.193083 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-scripts\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.194763 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-config-data\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.205151 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.208478 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq9r9\" (UniqueName: \"kubernetes.io/projected/6d7c8212-9e9e-4efb-b071-506f0dd66b60-kube-api-access-tq9r9\") pod \"cinder-scheduler-0\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.280476 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.288007 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-ovsdbserver-nb\") pod \"dnsmasq-dns-f64d5748f-f9s5c\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.288069 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-config\") pod \"dnsmasq-dns-f64d5748f-f9s5c\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.288093 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-config-data-custom\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.288137 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-config-data\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.288166 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ln5p\" (UniqueName: \"kubernetes.io/projected/06aa7d53-c968-419c-9782-f89c704e7ebe-kube-api-access-4ln5p\") pod \"dnsmasq-dns-f64d5748f-f9s5c\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.288194 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fee1ca24-8cd7-45bf-af29-3e1b0b652888-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.288356 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-ovsdbserver-sb\") pod \"dnsmasq-dns-f64d5748f-f9s5c\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.288476 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.288554 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fee1ca24-8cd7-45bf-af29-3e1b0b652888-logs\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.288575 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpzw2\" (UniqueName: \"kubernetes.io/projected/fee1ca24-8cd7-45bf-af29-3e1b0b652888-kube-api-access-wpzw2\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.288759 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-scripts\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.288813 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-dns-svc\") pod \"dnsmasq-dns-f64d5748f-f9s5c\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.288874 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-config\") pod \"dnsmasq-dns-f64d5748f-f9s5c\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.289427 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-ovsdbserver-sb\") pod \"dnsmasq-dns-f64d5748f-f9s5c\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.289851 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-dns-svc\") pod \"dnsmasq-dns-f64d5748f-f9s5c\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.290433 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-ovsdbserver-nb\") pod \"dnsmasq-dns-f64d5748f-f9s5c\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.303550 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ln5p\" (UniqueName: \"kubernetes.io/projected/06aa7d53-c968-419c-9782-f89c704e7ebe-kube-api-access-4ln5p\") pod \"dnsmasq-dns-f64d5748f-f9s5c\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.346220 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.390192 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-scripts\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.390470 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-config-data-custom\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.390924 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-config-data\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.390987 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fee1ca24-8cd7-45bf-af29-3e1b0b652888-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.391021 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.391044 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fee1ca24-8cd7-45bf-af29-3e1b0b652888-logs\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.391060 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpzw2\" (UniqueName: \"kubernetes.io/projected/fee1ca24-8cd7-45bf-af29-3e1b0b652888-kube-api-access-wpzw2\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.391819 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fee1ca24-8cd7-45bf-af29-3e1b0b652888-logs\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.391863 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fee1ca24-8cd7-45bf-af29-3e1b0b652888-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.396083 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-config-data-custom\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " 
pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.396206 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-config-data\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.398293 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-scripts\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.409707 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.414191 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpzw2\" (UniqueName: \"kubernetes.io/projected/fee1ca24-8cd7-45bf-af29-3e1b0b652888-kube-api-access-wpzw2\") pod \"cinder-api-0\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.448894 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 17:35:02 crc kubenswrapper[4778]: W0930 17:35:02.660707 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06aa7d53_c968_419c_9782_f89c704e7ebe.slice/crio-528a3a86c7476e08868e4844a34eabb25ccaa5e71f4368b28a148261e6658708 WatchSource:0}: Error finding container 528a3a86c7476e08868e4844a34eabb25ccaa5e71f4368b28a148261e6658708: Status 404 returned error can't find the container with id 528a3a86c7476e08868e4844a34eabb25ccaa5e71f4368b28a148261e6658708 Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.678818 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f64d5748f-f9s5c"] Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.718060 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:35:02 crc kubenswrapper[4778]: I0930 17:35:02.943063 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:35:02 crc kubenswrapper[4778]: W0930 17:35:02.969127 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfee1ca24_8cd7_45bf_af29_3e1b0b652888.slice/crio-d15fea9df7ab99ea73e16f94733eb0fd1460d41d960e978b702e883c6cf528da WatchSource:0}: Error finding container d15fea9df7ab99ea73e16f94733eb0fd1460d41d960e978b702e883c6cf528da: Status 404 returned error can't find the container with id d15fea9df7ab99ea73e16f94733eb0fd1460d41d960e978b702e883c6cf528da Sep 30 17:35:03 crc kubenswrapper[4778]: I0930 17:35:03.613581 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fee1ca24-8cd7-45bf-af29-3e1b0b652888","Type":"ContainerStarted","Data":"d15fea9df7ab99ea73e16f94733eb0fd1460d41d960e978b702e883c6cf528da"} Sep 30 17:35:03 crc kubenswrapper[4778]: I0930 17:35:03.617155 4778 generic.go:334] "Generic (PLEG): container finished" 
podID="06aa7d53-c968-419c-9782-f89c704e7ebe" containerID="4877f312b8679ca937fbfb68f157f8a07c9a3cfe3bcf0bfd4b950a405243180c" exitCode=0 Sep 30 17:35:03 crc kubenswrapper[4778]: I0930 17:35:03.617211 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" event={"ID":"06aa7d53-c968-419c-9782-f89c704e7ebe","Type":"ContainerDied","Data":"4877f312b8679ca937fbfb68f157f8a07c9a3cfe3bcf0bfd4b950a405243180c"} Sep 30 17:35:03 crc kubenswrapper[4778]: I0930 17:35:03.617231 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" event={"ID":"06aa7d53-c968-419c-9782-f89c704e7ebe","Type":"ContainerStarted","Data":"528a3a86c7476e08868e4844a34eabb25ccaa5e71f4368b28a148261e6658708"} Sep 30 17:35:03 crc kubenswrapper[4778]: I0930 17:35:03.619256 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6d7c8212-9e9e-4efb-b071-506f0dd66b60","Type":"ContainerStarted","Data":"05b61441b8a39379abdbe9d0906243af7fb8a5560d12aa0b8bac27dcd8d42759"} Sep 30 17:35:04 crc kubenswrapper[4778]: I0930 17:35:04.198148 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:35:04 crc kubenswrapper[4778]: I0930 17:35:04.629968 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fee1ca24-8cd7-45bf-af29-3e1b0b652888","Type":"ContainerStarted","Data":"9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4"} Sep 30 17:35:04 crc kubenswrapper[4778]: I0930 17:35:04.630421 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 17:35:04 crc kubenswrapper[4778]: I0930 17:35:04.630449 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fee1ca24-8cd7-45bf-af29-3e1b0b652888","Type":"ContainerStarted","Data":"a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea"} Sep 30 17:35:04 crc kubenswrapper[4778]: I0930 17:35:04.630061 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="fee1ca24-8cd7-45bf-af29-3e1b0b652888" containerName="cinder-api-log" containerID="cri-o://a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea" gracePeriod=30 Sep 30 17:35:04 crc kubenswrapper[4778]: I0930 17:35:04.630126 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="fee1ca24-8cd7-45bf-af29-3e1b0b652888" containerName="cinder-api" containerID="cri-o://9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4" gracePeriod=30 Sep 30 17:35:04 crc kubenswrapper[4778]: I0930 17:35:04.635799 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" event={"ID":"06aa7d53-c968-419c-9782-f89c704e7ebe","Type":"ContainerStarted","Data":"c8d82ada4adcb480b89d8301fc65bc430f2afd50c443192329988fca5fe19e06"} Sep 30 17:35:04 crc kubenswrapper[4778]: I0930 17:35:04.636161 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:04 crc kubenswrapper[4778]: I0930 17:35:04.642543 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6d7c8212-9e9e-4efb-b071-506f0dd66b60","Type":"ContainerStarted","Data":"4e1fd1713d3777c4e4f439731cf5d0965f3a4eaa9d8d35e6345deaedf231492b"} Sep 30 17:35:04 crc kubenswrapper[4778]: I0930 17:35:04.660590 4778 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.660570001 podStartE2EDuration="2.660570001s" podCreationTimestamp="2025-09-30 17:35:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:35:04.654328894 +0000 UTC m=+1043.644226717" watchObservedRunningTime="2025-09-30 17:35:04.660570001 +0000 UTC m=+1043.650467804" Sep 30 17:35:04 crc kubenswrapper[4778]: I0930 17:35:04.671934 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" podStartSLOduration=3.671917549 podStartE2EDuration="3.671917549s" podCreationTimestamp="2025-09-30 17:35:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:35:04.669574395 +0000 UTC m=+1043.659472198" watchObservedRunningTime="2025-09-30 17:35:04.671917549 +0000 UTC m=+1043.661815342" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.258495 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.449049 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fee1ca24-8cd7-45bf-af29-3e1b0b652888-etc-machine-id\") pod \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.449147 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-combined-ca-bundle\") pod \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.449171 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-config-data-custom\") pod \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.449265 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-config-data\") pod \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.449340 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fee1ca24-8cd7-45bf-af29-3e1b0b652888-logs\") pod \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.449393 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-scripts\") pod \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.449466 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpzw2\" (UniqueName: 
\"kubernetes.io/projected/fee1ca24-8cd7-45bf-af29-3e1b0b652888-kube-api-access-wpzw2\") pod \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\" (UID: \"fee1ca24-8cd7-45bf-af29-3e1b0b652888\") " Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.449817 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fee1ca24-8cd7-45bf-af29-3e1b0b652888-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fee1ca24-8cd7-45bf-af29-3e1b0b652888" (UID: "fee1ca24-8cd7-45bf-af29-3e1b0b652888"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.450219 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fee1ca24-8cd7-45bf-af29-3e1b0b652888-logs" (OuterVolumeSpecName: "logs") pod "fee1ca24-8cd7-45bf-af29-3e1b0b652888" (UID: "fee1ca24-8cd7-45bf-af29-3e1b0b652888"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.456315 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-scripts" (OuterVolumeSpecName: "scripts") pod "fee1ca24-8cd7-45bf-af29-3e1b0b652888" (UID: "fee1ca24-8cd7-45bf-af29-3e1b0b652888"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.457851 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fee1ca24-8cd7-45bf-af29-3e1b0b652888" (UID: "fee1ca24-8cd7-45bf-af29-3e1b0b652888"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.462046 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fee1ca24-8cd7-45bf-af29-3e1b0b652888-kube-api-access-wpzw2" (OuterVolumeSpecName: "kube-api-access-wpzw2") pod "fee1ca24-8cd7-45bf-af29-3e1b0b652888" (UID: "fee1ca24-8cd7-45bf-af29-3e1b0b652888"). InnerVolumeSpecName "kube-api-access-wpzw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.512633 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fee1ca24-8cd7-45bf-af29-3e1b0b652888" (UID: "fee1ca24-8cd7-45bf-af29-3e1b0b652888"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.520337 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-config-data" (OuterVolumeSpecName: "config-data") pod "fee1ca24-8cd7-45bf-af29-3e1b0b652888" (UID: "fee1ca24-8cd7-45bf-af29-3e1b0b652888"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.535857 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67d9465564-vcjl8" podUID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.134:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.134:8443: connect: connection refused" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.551603 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.551683 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fee1ca24-8cd7-45bf-af29-3e1b0b652888-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.551695 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.551707 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpzw2\" (UniqueName: \"kubernetes.io/projected/fee1ca24-8cd7-45bf-af29-3e1b0b652888-kube-api-access-wpzw2\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.551724 4778 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fee1ca24-8cd7-45bf-af29-3e1b0b652888-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.551736 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.551748 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fee1ca24-8cd7-45bf-af29-3e1b0b652888-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.651581 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6d7c8212-9e9e-4efb-b071-506f0dd66b60","Type":"ContainerStarted","Data":"9aecdc060e54904aaaa3524e7a3cb64a9c32351191a67a8711064165b64d23dc"} Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.653024 4778 generic.go:334] "Generic (PLEG): container finished" podID="fee1ca24-8cd7-45bf-af29-3e1b0b652888" containerID="9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4" exitCode=0 Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.653054 4778 generic.go:334] "Generic (PLEG): container finished" podID="fee1ca24-8cd7-45bf-af29-3e1b0b652888" containerID="a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea" exitCode=143 Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.653522 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.653969 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fee1ca24-8cd7-45bf-af29-3e1b0b652888","Type":"ContainerDied","Data":"9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4"} Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.653999 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fee1ca24-8cd7-45bf-af29-3e1b0b652888","Type":"ContainerDied","Data":"a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea"} Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.654013 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fee1ca24-8cd7-45bf-af29-3e1b0b652888","Type":"ContainerDied","Data":"d15fea9df7ab99ea73e16f94733eb0fd1460d41d960e978b702e883c6cf528da"} Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.654031 4778 scope.go:117] "RemoveContainer" containerID="9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.676598 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.018629591 podStartE2EDuration="4.676583395s" podCreationTimestamp="2025-09-30 17:35:01 +0000 UTC" firstStartedPulling="2025-09-30 17:35:02.741816688 +0000 UTC m=+1041.731714491" lastFinishedPulling="2025-09-30 17:35:03.399770492 +0000 UTC m=+1042.389668295" observedRunningTime="2025-09-30 17:35:05.674456998 +0000 UTC m=+1044.664354801" watchObservedRunningTime="2025-09-30 17:35:05.676583395 +0000 UTC m=+1044.666481208" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.680237 4778 scope.go:117] "RemoveContainer" containerID="a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.704585 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.709077 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.724179 4778 scope.go:117] "RemoveContainer" containerID="9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4" Sep 30 17:35:05 crc kubenswrapper[4778]: E0930 17:35:05.727627 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4\": container with ID starting with 9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4 not found: ID does not exist" containerID="9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.727663 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4"} err="failed to get container status \"9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4\": rpc error: code = NotFound desc = could not find container \"9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4\": container with ID starting with 9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4 not found: ID does not exist" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.727685 4778 scope.go:117] "RemoveContainer" 
containerID="a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea" Sep 30 17:35:05 crc kubenswrapper[4778]: E0930 17:35:05.727977 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea\": container with ID starting with a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea not found: ID does not exist" containerID="a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.727999 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea"} err="failed to get container status \"a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea\": rpc error: code = NotFound desc = could not find container \"a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea\": container with ID starting with a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea not found: ID does not exist" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.728012 4778 scope.go:117] "RemoveContainer" containerID="9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.728408 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4"} err="failed to get container status \"9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4\": rpc error: code = NotFound desc = could not find container \"9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4\": container with ID starting with 9d260c3e4610359b080faf48ffb7d1670a4a4f40afb606e80b20b0b6f47e50b4 not found: ID does not exist" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.728429 4778 scope.go:117] "RemoveContainer" containerID="a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.728687 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea"} err="failed to get container status \"a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea\": rpc error: code = NotFound desc = could not find container \"a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea\": container with ID starting with a989d816726648bc4c2439eeda204e6741b497135b1c8971eb597930110f91ea not found: ID does not exist" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.742387 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee1ca24-8cd7-45bf-af29-3e1b0b652888" path="/var/lib/kubelet/pods/fee1ca24-8cd7-45bf-af29-3e1b0b652888/volumes" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.742942 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:35:05 crc kubenswrapper[4778]: E0930 17:35:05.743192 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fee1ca24-8cd7-45bf-af29-3e1b0b652888" containerName="cinder-api" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.743209 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee1ca24-8cd7-45bf-af29-3e1b0b652888" containerName="cinder-api" Sep 30 17:35:05 crc kubenswrapper[4778]: E0930 17:35:05.743230 4778 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fee1ca24-8cd7-45bf-af29-3e1b0b652888" containerName="cinder-api-log" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.743237 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee1ca24-8cd7-45bf-af29-3e1b0b652888" containerName="cinder-api-log" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.743388 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee1ca24-8cd7-45bf-af29-3e1b0b652888" containerName="cinder-api" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.743409 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="fee1ca24-8cd7-45bf-af29-3e1b0b652888" containerName="cinder-api-log" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.744415 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.745576 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.747785 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.748733 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.749038 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.859409 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4471ee2f-0551-44ec-8808-8085a962de1f-logs\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.859471 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.859525 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.859588 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4471ee2f-0551-44ec-8808-8085a962de1f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.859635 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-config-data\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.859679 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-8b9jf\" (UniqueName: \"kubernetes.io/projected/4471ee2f-0551-44ec-8808-8085a962de1f-kube-api-access-8b9jf\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.859743 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-scripts\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.859775 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.859814 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-config-data-custom\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.962011 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4471ee2f-0551-44ec-8808-8085a962de1f-logs\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.962052 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.962085 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.962113 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4471ee2f-0551-44ec-8808-8085a962de1f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.962130 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-config-data\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.962153 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b9jf\" (UniqueName: \"kubernetes.io/projected/4471ee2f-0551-44ec-8808-8085a962de1f-kube-api-access-8b9jf\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.962191 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-scripts\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.962208 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.962231 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-config-data-custom\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.962797 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4471ee2f-0551-44ec-8808-8085a962de1f-etc-machine-id\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.963054 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4471ee2f-0551-44ec-8808-8085a962de1f-logs\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.967228 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-config-data-custom\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.967366 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-scripts\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.968077 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.970977 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.971731 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-public-tls-certs\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.974156 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/4471ee2f-0551-44ec-8808-8085a962de1f-config-data\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:05 crc kubenswrapper[4778]: I0930 17:35:05.987036 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b9jf\" (UniqueName: \"kubernetes.io/projected/4471ee2f-0551-44ec-8808-8085a962de1f-kube-api-access-8b9jf\") pod \"cinder-api-0\" (UID: \"4471ee2f-0551-44ec-8808-8085a962de1f\") " pod="openstack/cinder-api-0" Sep 30 17:35:06 crc kubenswrapper[4778]: I0930 17:35:06.073139 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 17:35:06 crc kubenswrapper[4778]: I0930 17:35:06.621410 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 17:35:06 crc kubenswrapper[4778]: I0930 17:35:06.668015 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4471ee2f-0551-44ec-8808-8085a962de1f","Type":"ContainerStarted","Data":"d07773bd5b1380bab588bd5aad9e55caa2cab2380ea1ab55715eec3e05fe3c56"} Sep 30 17:35:07 crc kubenswrapper[4778]: I0930 17:35:07.280799 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 17:35:07 crc kubenswrapper[4778]: I0930 17:35:07.680177 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4471ee2f-0551-44ec-8808-8085a962de1f","Type":"ContainerStarted","Data":"0c4cb48b770acd1af8edebf03e30289a865f2f1b9dfaa3e9781070e7dfda6eca"} Sep 30 17:35:08 crc kubenswrapper[4778]: I0930 17:35:08.693135 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"4471ee2f-0551-44ec-8808-8085a962de1f","Type":"ContainerStarted","Data":"91c115d287f41c6abc4fc71a48673deff26c0f11d6dea2b3203e876ecf522a42"} Sep 30 17:35:08 crc kubenswrapper[4778]: I0930 17:35:08.693566 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 17:35:08 crc kubenswrapper[4778]: I0930 17:35:08.718825 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.718809274 podStartE2EDuration="3.718809274s" podCreationTimestamp="2025-09-30 17:35:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:35:08.713759405 +0000 UTC m=+1047.703657208" watchObservedRunningTime="2025-09-30 17:35:08.718809274 +0000 UTC m=+1047.708707077" Sep 30 17:35:12 crc kubenswrapper[4778]: I0930 17:35:12.185692 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-56c95856-xw6pr" Sep 30 17:35:12 crc kubenswrapper[4778]: I0930 17:35:12.351984 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:35:12 crc kubenswrapper[4778]: I0930 17:35:12.437155 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-wgw8f"] Sep 30 17:35:12 crc kubenswrapper[4778]: I0930 17:35:12.437391 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" podUID="c316357a-faa3-4a48-9e86-0f9138b01f76" containerName="dnsmasq-dns" containerID="cri-o://c58eeacc1efa12e3f03e7c69fc7e4db1e7e76bfd5ab48878df0e3d0064a25639" gracePeriod=10 Sep 
30 17:35:12 crc kubenswrapper[4778]: I0930 17:35:12.611137 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 17:35:12 crc kubenswrapper[4778]: I0930 17:35:12.643655 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:35:12 crc kubenswrapper[4778]: I0930 17:35:12.730253 4778 generic.go:334] "Generic (PLEG): container finished" podID="c316357a-faa3-4a48-9e86-0f9138b01f76" containerID="c58eeacc1efa12e3f03e7c69fc7e4db1e7e76bfd5ab48878df0e3d0064a25639" exitCode=0 Sep 30 17:35:12 crc kubenswrapper[4778]: I0930 17:35:12.730315 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" event={"ID":"c316357a-faa3-4a48-9e86-0f9138b01f76","Type":"ContainerDied","Data":"c58eeacc1efa12e3f03e7c69fc7e4db1e7e76bfd5ab48878df0e3d0064a25639"} Sep 30 17:35:12 crc kubenswrapper[4778]: I0930 17:35:12.730457 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6d7c8212-9e9e-4efb-b071-506f0dd66b60" containerName="cinder-scheduler" containerID="cri-o://4e1fd1713d3777c4e4f439731cf5d0965f3a4eaa9d8d35e6345deaedf231492b" gracePeriod=30 Sep 30 17:35:12 crc kubenswrapper[4778]: I0930 17:35:12.730497 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6d7c8212-9e9e-4efb-b071-506f0dd66b60" containerName="probe" containerID="cri-o://9aecdc060e54904aaaa3524e7a3cb64a9c32351191a67a8711064165b64d23dc" gracePeriod=30 Sep 30 17:35:12 crc kubenswrapper[4778]: I0930 17:35:12.928928 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.092876 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-ovsdbserver-sb\") pod \"c316357a-faa3-4a48-9e86-0f9138b01f76\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.093034 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-ovsdbserver-nb\") pod \"c316357a-faa3-4a48-9e86-0f9138b01f76\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.093095 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j48j5\" (UniqueName: \"kubernetes.io/projected/c316357a-faa3-4a48-9e86-0f9138b01f76-kube-api-access-j48j5\") pod \"c316357a-faa3-4a48-9e86-0f9138b01f76\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.093157 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-config\") pod \"c316357a-faa3-4a48-9e86-0f9138b01f76\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.093217 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-dns-svc\") pod \"c316357a-faa3-4a48-9e86-0f9138b01f76\" (UID: \"c316357a-faa3-4a48-9e86-0f9138b01f76\") " Sep 30 17:35:13 crc 
kubenswrapper[4778]: I0930 17:35:13.100768 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c316357a-faa3-4a48-9e86-0f9138b01f76-kube-api-access-j48j5" (OuterVolumeSpecName: "kube-api-access-j48j5") pod "c316357a-faa3-4a48-9e86-0f9138b01f76" (UID: "c316357a-faa3-4a48-9e86-0f9138b01f76"). InnerVolumeSpecName "kube-api-access-j48j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.145974 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c316357a-faa3-4a48-9e86-0f9138b01f76" (UID: "c316357a-faa3-4a48-9e86-0f9138b01f76"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.166851 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c316357a-faa3-4a48-9e86-0f9138b01f76" (UID: "c316357a-faa3-4a48-9e86-0f9138b01f76"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.168144 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c316357a-faa3-4a48-9e86-0f9138b01f76" (UID: "c316357a-faa3-4a48-9e86-0f9138b01f76"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.173147 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-config" (OuterVolumeSpecName: "config") pod "c316357a-faa3-4a48-9e86-0f9138b01f76" (UID: "c316357a-faa3-4a48-9e86-0f9138b01f76"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.195854 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.195916 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j48j5\" (UniqueName: \"kubernetes.io/projected/c316357a-faa3-4a48-9e86-0f9138b01f76-kube-api-access-j48j5\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.195933 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.195997 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.196010 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c316357a-faa3-4a48-9e86-0f9138b01f76-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.759421 4778 generic.go:334] "Generic (PLEG): container finished" podID="6d7c8212-9e9e-4efb-b071-506f0dd66b60" containerID="9aecdc060e54904aaaa3524e7a3cb64a9c32351191a67a8711064165b64d23dc" exitCode=0 Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.759505 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6d7c8212-9e9e-4efb-b071-506f0dd66b60","Type":"ContainerDied","Data":"9aecdc060e54904aaaa3524e7a3cb64a9c32351191a67a8711064165b64d23dc"} Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.762036 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" event={"ID":"c316357a-faa3-4a48-9e86-0f9138b01f76","Type":"ContainerDied","Data":"bf4f74b6a156a9834ea41fcb8ec84b7b41e0e34e1d8a8fc7b131c582254c3f33"} Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.762100 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-wgw8f" Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.762135 4778 scope.go:117] "RemoveContainer" containerID="c58eeacc1efa12e3f03e7c69fc7e4db1e7e76bfd5ab48878df0e3d0064a25639" Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.793502 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-wgw8f"] Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.798717 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-wgw8f"] Sep 30 17:35:13 crc kubenswrapper[4778]: I0930 17:35:13.800703 4778 scope.go:117] "RemoveContainer" containerID="c10362c784a5abc55288b5514b67d8c30d1ac8317acb8b3c292ba47bff60203a" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.094174 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Sep 30 17:35:14 crc kubenswrapper[4778]: E0930 17:35:14.094849 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c316357a-faa3-4a48-9e86-0f9138b01f76" containerName="init" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.094957 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c316357a-faa3-4a48-9e86-0f9138b01f76" containerName="init" Sep 30 17:35:14 crc kubenswrapper[4778]: E0930 17:35:14.095085 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c316357a-faa3-4a48-9e86-0f9138b01f76" containerName="dnsmasq-dns" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.095171 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c316357a-faa3-4a48-9e86-0f9138b01f76" containerName="dnsmasq-dns" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.095440 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c316357a-faa3-4a48-9e86-0f9138b01f76" containerName="dnsmasq-dns" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.096268 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.099375 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.099786 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-6dnq8" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.099893 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.121144 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.217020 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bd1cc25-8254-44e2-b64a-1711eed0609e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9bd1cc25-8254-44e2-b64a-1711eed0609e\") " pod="openstack/openstackclient" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.217078 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9bd1cc25-8254-44e2-b64a-1711eed0609e-openstack-config-secret\") pod \"openstackclient\" (UID: \"9bd1cc25-8254-44e2-b64a-1711eed0609e\") " pod="openstack/openstackclient" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.217101 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9bd1cc25-8254-44e2-b64a-1711eed0609e-openstack-config\") pod \"openstackclient\" (UID: \"9bd1cc25-8254-44e2-b64a-1711eed0609e\") " pod="openstack/openstackclient" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.217159 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcdbq\" (UniqueName: \"kubernetes.io/projected/9bd1cc25-8254-44e2-b64a-1711eed0609e-kube-api-access-qcdbq\") pod \"openstackclient\" (UID: \"9bd1cc25-8254-44e2-b64a-1711eed0609e\") " pod="openstack/openstackclient" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.319405 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bd1cc25-8254-44e2-b64a-1711eed0609e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9bd1cc25-8254-44e2-b64a-1711eed0609e\") " pod="openstack/openstackclient" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.319522 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9bd1cc25-8254-44e2-b64a-1711eed0609e-openstack-config-secret\") pod \"openstackclient\" (UID: \"9bd1cc25-8254-44e2-b64a-1711eed0609e\") " pod="openstack/openstackclient" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.319595 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9bd1cc25-8254-44e2-b64a-1711eed0609e-openstack-config\") pod \"openstackclient\" (UID: \"9bd1cc25-8254-44e2-b64a-1711eed0609e\") " pod="openstack/openstackclient" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.319771 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qcdbq\" (UniqueName: \"kubernetes.io/projected/9bd1cc25-8254-44e2-b64a-1711eed0609e-kube-api-access-qcdbq\") pod \"openstackclient\" (UID: \"9bd1cc25-8254-44e2-b64a-1711eed0609e\") " pod="openstack/openstackclient" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.320457 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9bd1cc25-8254-44e2-b64a-1711eed0609e-openstack-config\") pod \"openstackclient\" (UID: \"9bd1cc25-8254-44e2-b64a-1711eed0609e\") " pod="openstack/openstackclient" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.327370 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9bd1cc25-8254-44e2-b64a-1711eed0609e-openstack-config-secret\") pod \"openstackclient\" (UID: \"9bd1cc25-8254-44e2-b64a-1711eed0609e\") " pod="openstack/openstackclient" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.327549 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bd1cc25-8254-44e2-b64a-1711eed0609e-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9bd1cc25-8254-44e2-b64a-1711eed0609e\") " pod="openstack/openstackclient" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.339994 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcdbq\" (UniqueName: \"kubernetes.io/projected/9bd1cc25-8254-44e2-b64a-1711eed0609e-kube-api-access-qcdbq\") pod \"openstackclient\" (UID: \"9bd1cc25-8254-44e2-b64a-1711eed0609e\") " pod="openstack/openstackclient" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.433012 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.812228 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.812278 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.812316 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.813018 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"97044f3dbaff452261d88827459c4da3b10678dd945a7792a4a92fff2dc6be50"} pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.813080 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" containerID="cri-o://97044f3dbaff452261d88827459c4da3b10678dd945a7792a4a92fff2dc6be50" gracePeriod=600 Sep 30 17:35:14 crc kubenswrapper[4778]: I0930 17:35:14.926365 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.533217 4778 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-67d9465564-vcjl8" podUID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.134:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.134:8443: connect: connection refused" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.536691 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.642292 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq9r9\" (UniqueName: \"kubernetes.io/projected/6d7c8212-9e9e-4efb-b071-506f0dd66b60-kube-api-access-tq9r9\") pod \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.642796 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-config-data\") pod \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.642993 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d7c8212-9e9e-4efb-b071-506f0dd66b60-etc-machine-id\") pod \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.643042 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-scripts\") pod \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.643083 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-combined-ca-bundle\") pod \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.643117 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-config-data-custom\") pod \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\" (UID: \"6d7c8212-9e9e-4efb-b071-506f0dd66b60\") " Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.643099 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d7c8212-9e9e-4efb-b071-506f0dd66b60-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6d7c8212-9e9e-4efb-b071-506f0dd66b60" (UID: "6d7c8212-9e9e-4efb-b071-506f0dd66b60"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.664835 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6d7c8212-9e9e-4efb-b071-506f0dd66b60" (UID: "6d7c8212-9e9e-4efb-b071-506f0dd66b60"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.664916 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-scripts" (OuterVolumeSpecName: "scripts") pod "6d7c8212-9e9e-4efb-b071-506f0dd66b60" (UID: "6d7c8212-9e9e-4efb-b071-506f0dd66b60"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.665013 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d7c8212-9e9e-4efb-b071-506f0dd66b60-kube-api-access-tq9r9" (OuterVolumeSpecName: "kube-api-access-tq9r9") pod "6d7c8212-9e9e-4efb-b071-506f0dd66b60" (UID: "6d7c8212-9e9e-4efb-b071-506f0dd66b60"). InnerVolumeSpecName "kube-api-access-tq9r9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.700216 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d7c8212-9e9e-4efb-b071-506f0dd66b60" (UID: "6d7c8212-9e9e-4efb-b071-506f0dd66b60"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.725997 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c316357a-faa3-4a48-9e86-0f9138b01f76" path="/var/lib/kubelet/pods/c316357a-faa3-4a48-9e86-0f9138b01f76/volumes" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.747911 4778 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d7c8212-9e9e-4efb-b071-506f0dd66b60-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.747942 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.747954 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.747965 4778 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.747976 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq9r9\" (UniqueName: \"kubernetes.io/projected/6d7c8212-9e9e-4efb-b071-506f0dd66b60-kube-api-access-tq9r9\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.810758 4778 generic.go:334] "Generic (PLEG): container finished" podID="6d7c8212-9e9e-4efb-b071-506f0dd66b60" containerID="4e1fd1713d3777c4e4f439731cf5d0965f3a4eaa9d8d35e6345deaedf231492b" exitCode=0 Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.811061 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6d7c8212-9e9e-4efb-b071-506f0dd66b60","Type":"ContainerDied","Data":"4e1fd1713d3777c4e4f439731cf5d0965f3a4eaa9d8d35e6345deaedf231492b"} Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.811162 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6d7c8212-9e9e-4efb-b071-506f0dd66b60","Type":"ContainerDied","Data":"05b61441b8a39379abdbe9d0906243af7fb8a5560d12aa0b8bac27dcd8d42759"} Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.812532 4778 scope.go:117] "RemoveContainer" 
containerID="9aecdc060e54904aaaa3524e7a3cb64a9c32351191a67a8711064165b64d23dc" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.812599 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.822075 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9bd1cc25-8254-44e2-b64a-1711eed0609e","Type":"ContainerStarted","Data":"676c37a2933ac99389784b6b0bf501d53701c687a455179e12361425a3f28bd9"} Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.826068 4778 generic.go:334] "Generic (PLEG): container finished" podID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerID="97044f3dbaff452261d88827459c4da3b10678dd945a7792a4a92fff2dc6be50" exitCode=0 Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.826188 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerDied","Data":"97044f3dbaff452261d88827459c4da3b10678dd945a7792a4a92fff2dc6be50"} Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.826258 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerStarted","Data":"326830dff025b3a623c0b61fcab0be0c2c9f34db066bfe20235b1b092a5d8935"} Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.855690 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-config-data" (OuterVolumeSpecName: "config-data") pod "6d7c8212-9e9e-4efb-b071-506f0dd66b60" (UID: "6d7c8212-9e9e-4efb-b071-506f0dd66b60"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.879005 4778 scope.go:117] "RemoveContainer" containerID="4e1fd1713d3777c4e4f439731cf5d0965f3a4eaa9d8d35e6345deaedf231492b" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.902180 4778 scope.go:117] "RemoveContainer" containerID="9aecdc060e54904aaaa3524e7a3cb64a9c32351191a67a8711064165b64d23dc" Sep 30 17:35:15 crc kubenswrapper[4778]: E0930 17:35:15.902907 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aecdc060e54904aaaa3524e7a3cb64a9c32351191a67a8711064165b64d23dc\": container with ID starting with 9aecdc060e54904aaaa3524e7a3cb64a9c32351191a67a8711064165b64d23dc not found: ID does not exist" containerID="9aecdc060e54904aaaa3524e7a3cb64a9c32351191a67a8711064165b64d23dc" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.902961 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aecdc060e54904aaaa3524e7a3cb64a9c32351191a67a8711064165b64d23dc"} err="failed to get container status \"9aecdc060e54904aaaa3524e7a3cb64a9c32351191a67a8711064165b64d23dc\": rpc error: code = NotFound desc = could not find container \"9aecdc060e54904aaaa3524e7a3cb64a9c32351191a67a8711064165b64d23dc\": container with ID starting with 9aecdc060e54904aaaa3524e7a3cb64a9c32351191a67a8711064165b64d23dc not found: ID does not exist" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.902999 4778 scope.go:117] "RemoveContainer" containerID="4e1fd1713d3777c4e4f439731cf5d0965f3a4eaa9d8d35e6345deaedf231492b" Sep 30 17:35:15 crc kubenswrapper[4778]: E0930 17:35:15.903483 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e1fd1713d3777c4e4f439731cf5d0965f3a4eaa9d8d35e6345deaedf231492b\": container with ID starting with 4e1fd1713d3777c4e4f439731cf5d0965f3a4eaa9d8d35e6345deaedf231492b not found: ID does not exist" containerID="4e1fd1713d3777c4e4f439731cf5d0965f3a4eaa9d8d35e6345deaedf231492b" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.903519 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e1fd1713d3777c4e4f439731cf5d0965f3a4eaa9d8d35e6345deaedf231492b"} err="failed to get container status \"4e1fd1713d3777c4e4f439731cf5d0965f3a4eaa9d8d35e6345deaedf231492b\": rpc error: code = NotFound desc = could not find container \"4e1fd1713d3777c4e4f439731cf5d0965f3a4eaa9d8d35e6345deaedf231492b\": container with ID starting with 4e1fd1713d3777c4e4f439731cf5d0965f3a4eaa9d8d35e6345deaedf231492b not found: ID does not exist" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.903536 4778 scope.go:117] "RemoveContainer" containerID="7e8e94b4bfe4e71036adf0980559cf4826c826e3aed3c9f8f0d61aee964cec6e" Sep 30 17:35:15 crc kubenswrapper[4778]: I0930 17:35:15.951810 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d7c8212-9e9e-4efb-b071-506f0dd66b60-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.145331 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.159610 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.173746 4778 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:35:16 crc kubenswrapper[4778]: E0930 17:35:16.174083 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d7c8212-9e9e-4efb-b071-506f0dd66b60" containerName="cinder-scheduler" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.174099 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d7c8212-9e9e-4efb-b071-506f0dd66b60" containerName="cinder-scheduler" Sep 30 17:35:16 crc kubenswrapper[4778]: E0930 17:35:16.174111 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d7c8212-9e9e-4efb-b071-506f0dd66b60" containerName="probe" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.174118 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d7c8212-9e9e-4efb-b071-506f0dd66b60" containerName="probe" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.174277 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d7c8212-9e9e-4efb-b071-506f0dd66b60" containerName="cinder-scheduler" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.174296 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d7c8212-9e9e-4efb-b071-506f0dd66b60" containerName="probe" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.175122 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.177954 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.180066 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.255559 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3515e8e-89f6-4a7d-ab60-53bebbc77315-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.255705 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz8dj\" (UniqueName: \"kubernetes.io/projected/e3515e8e-89f6-4a7d-ab60-53bebbc77315-kube-api-access-vz8dj\") pod \"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.255776 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3515e8e-89f6-4a7d-ab60-53bebbc77315-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.255821 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3515e8e-89f6-4a7d-ab60-53bebbc77315-scripts\") pod \"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.255859 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3515e8e-89f6-4a7d-ab60-53bebbc77315-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.256049 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3515e8e-89f6-4a7d-ab60-53bebbc77315-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.358332 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz8dj\" (UniqueName: \"kubernetes.io/projected/e3515e8e-89f6-4a7d-ab60-53bebbc77315-kube-api-access-vz8dj\") pod \"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.358467 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3515e8e-89f6-4a7d-ab60-53bebbc77315-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.358530 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3515e8e-89f6-4a7d-ab60-53bebbc77315-scripts\") pod \"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.358565 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3515e8e-89f6-4a7d-ab60-53bebbc77315-config-data\") pod \"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.358666 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3515e8e-89f6-4a7d-ab60-53bebbc77315-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.358765 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3515e8e-89f6-4a7d-ab60-53bebbc77315-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.358840 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3515e8e-89f6-4a7d-ab60-53bebbc77315-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.363707 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3515e8e-89f6-4a7d-ab60-53bebbc77315-scripts\") pod \"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.363898 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3515e8e-89f6-4a7d-ab60-53bebbc77315-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.367424 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3515e8e-89f6-4a7d-ab60-53bebbc77315-config-data\") pod \"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.373694 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz8dj\" (UniqueName: \"kubernetes.io/projected/e3515e8e-89f6-4a7d-ab60-53bebbc77315-kube-api-access-vz8dj\") pod \"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.374239 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3515e8e-89f6-4a7d-ab60-53bebbc77315-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e3515e8e-89f6-4a7d-ab60-53bebbc77315\") " pod="openstack/cinder-scheduler-0" Sep 30 17:35:16 crc kubenswrapper[4778]: I0930 17:35:16.499226 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 17:35:17 crc kubenswrapper[4778]: I0930 17:35:17.001949 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 17:35:17 crc kubenswrapper[4778]: I0930 17:35:17.192903 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:35:17 crc kubenswrapper[4778]: I0930 17:35:17.724562 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d7c8212-9e9e-4efb-b071-506f0dd66b60" path="/var/lib/kubelet/pods/6d7c8212-9e9e-4efb-b071-506f0dd66b60/volumes" Sep 30 17:35:17 crc kubenswrapper[4778]: I0930 17:35:17.743494 4778 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod3e093cbb-cd23-4a15-8159-fa514c992d60"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod3e093cbb-cd23-4a15-8159-fa514c992d60] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3e093cbb_cd23_4a15_8159_fa514c992d60.slice" Sep 30 17:35:17 crc kubenswrapper[4778]: E0930 17:35:17.743542 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod3e093cbb-cd23-4a15-8159-fa514c992d60] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod3e093cbb-cd23-4a15-8159-fa514c992d60] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3e093cbb_cd23_4a15_8159_fa514c992d60.slice" pod="openstack/horizon-5cdc899d97-c5xnr" podUID="3e093cbb-cd23-4a15-8159-fa514c992d60" Sep 30 17:35:17 crc kubenswrapper[4778]: I0930 17:35:17.851982 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5cdc899d97-c5xnr" Sep 30 17:35:17 crc kubenswrapper[4778]: I0930 17:35:17.852045 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3515e8e-89f6-4a7d-ab60-53bebbc77315","Type":"ContainerStarted","Data":"6e0419ae65c6536eabc227102d334c1ada812301c233a4b6082ef04c7a007561"} Sep 30 17:35:17 crc kubenswrapper[4778]: I0930 17:35:17.852086 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3515e8e-89f6-4a7d-ab60-53bebbc77315","Type":"ContainerStarted","Data":"bf93496a76ac8706e51f0fa1e7f49c6b0301ce4d2777d54701f463f85ee220c1"} Sep 30 17:35:17 crc kubenswrapper[4778]: I0930 17:35:17.880412 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5cdc899d97-c5xnr"] Sep 30 17:35:17 crc kubenswrapper[4778]: I0930 17:35:17.887521 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5cdc899d97-c5xnr"] Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.041463 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.562438 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-d9t4c"] Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.564041 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-d9t4c" Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.575300 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-d9t4c"] Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.626248 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz9w4\" (UniqueName: \"kubernetes.io/projected/7a06f678-ac41-4482-8889-c85b2beb6703-kube-api-access-qz9w4\") pod \"nova-api-db-create-d9t4c\" (UID: \"7a06f678-ac41-4482-8889-c85b2beb6703\") " pod="openstack/nova-api-db-create-d9t4c" Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.668303 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-7qsbn"] Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.675540 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-7qsbn" Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.678074 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-7qsbn"] Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.730184 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbfk8\" (UniqueName: \"kubernetes.io/projected/2ee6faf7-113e-4545-b3d5-e29e611c4f3e-kube-api-access-cbfk8\") pod \"nova-cell0-db-create-7qsbn\" (UID: \"2ee6faf7-113e-4545-b3d5-e29e611c4f3e\") " pod="openstack/nova-cell0-db-create-7qsbn" Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.730280 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz9w4\" (UniqueName: \"kubernetes.io/projected/7a06f678-ac41-4482-8889-c85b2beb6703-kube-api-access-qz9w4\") pod \"nova-api-db-create-d9t4c\" (UID: \"7a06f678-ac41-4482-8889-c85b2beb6703\") " pod="openstack/nova-api-db-create-d9t4c" Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.761076 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz9w4\" (UniqueName: \"kubernetes.io/projected/7a06f678-ac41-4482-8889-c85b2beb6703-kube-api-access-qz9w4\") pod \"nova-api-db-create-d9t4c\" (UID: \"7a06f678-ac41-4482-8889-c85b2beb6703\") " pod="openstack/nova-api-db-create-d9t4c" Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.782278 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-26rxw"] Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.783302 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-26rxw" Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.812885 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-26rxw"] Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.832406 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vwrr\" (UniqueName: \"kubernetes.io/projected/65194693-743e-4b92-9bb3-3a9e0ea49273-kube-api-access-7vwrr\") pod \"nova-cell1-db-create-26rxw\" (UID: \"65194693-743e-4b92-9bb3-3a9e0ea49273\") " pod="openstack/nova-cell1-db-create-26rxw" Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.832521 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbfk8\" (UniqueName: \"kubernetes.io/projected/2ee6faf7-113e-4545-b3d5-e29e611c4f3e-kube-api-access-cbfk8\") pod \"nova-cell0-db-create-7qsbn\" (UID: \"2ee6faf7-113e-4545-b3d5-e29e611c4f3e\") " pod="openstack/nova-cell0-db-create-7qsbn" Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.857273 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbfk8\" (UniqueName: \"kubernetes.io/projected/2ee6faf7-113e-4545-b3d5-e29e611c4f3e-kube-api-access-cbfk8\") pod \"nova-cell0-db-create-7qsbn\" (UID: \"2ee6faf7-113e-4545-b3d5-e29e611c4f3e\") " pod="openstack/nova-cell0-db-create-7qsbn" Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.866017 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3515e8e-89f6-4a7d-ab60-53bebbc77315","Type":"ContainerStarted","Data":"a887b8d59371b9aebded9e9804e06cebb59725f7adaffe35557d672c3aa33b10"} Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.881273 4778 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-api-db-create-d9t4c" Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.890174 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.8901591079999998 podStartE2EDuration="2.890159108s" podCreationTimestamp="2025-09-30 17:35:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:35:18.885321116 +0000 UTC m=+1057.875218919" watchObservedRunningTime="2025-09-30 17:35:18.890159108 +0000 UTC m=+1057.880056901" Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.934176 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vwrr\" (UniqueName: \"kubernetes.io/projected/65194693-743e-4b92-9bb3-3a9e0ea49273-kube-api-access-7vwrr\") pod \"nova-cell1-db-create-26rxw\" (UID: \"65194693-743e-4b92-9bb3-3a9e0ea49273\") " pod="openstack/nova-cell1-db-create-26rxw" Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.953658 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vwrr\" (UniqueName: \"kubernetes.io/projected/65194693-743e-4b92-9bb3-3a9e0ea49273-kube-api-access-7vwrr\") pod \"nova-cell1-db-create-26rxw\" (UID: \"65194693-743e-4b92-9bb3-3a9e0ea49273\") " pod="openstack/nova-cell1-db-create-26rxw" Sep 30 17:35:18 crc kubenswrapper[4778]: I0930 17:35:18.988597 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-7qsbn" Sep 30 17:35:19 crc kubenswrapper[4778]: I0930 17:35:19.130806 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-26rxw" Sep 30 17:35:19 crc kubenswrapper[4778]: I0930 17:35:19.387712 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-d9t4c"] Sep 30 17:35:19 crc kubenswrapper[4778]: I0930 17:35:19.516881 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-7qsbn"] Sep 30 17:35:19 crc kubenswrapper[4778]: W0930 17:35:19.524402 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ee6faf7_113e_4545_b3d5_e29e611c4f3e.slice/crio-0c48535ab91c8ddd00e5a567a2f88a7bcb54f14d1ff7823b55a25f08b2235be8 WatchSource:0}: Error finding container 0c48535ab91c8ddd00e5a567a2f88a7bcb54f14d1ff7823b55a25f08b2235be8: Status 404 returned error can't find the container with id 0c48535ab91c8ddd00e5a567a2f88a7bcb54f14d1ff7823b55a25f08b2235be8 Sep 30 17:35:19 crc kubenswrapper[4778]: I0930 17:35:19.680541 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-26rxw"] Sep 30 17:35:19 crc kubenswrapper[4778]: W0930 17:35:19.688429 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65194693_743e_4b92_9bb3_3a9e0ea49273.slice/crio-e189183e9652f21c78db1afc213a0e00f1bb41e27b24c1fb583f7d0fc4b4f4d0 WatchSource:0}: Error finding container e189183e9652f21c78db1afc213a0e00f1bb41e27b24c1fb583f7d0fc4b4f4d0: Status 404 returned error can't find the container with id e189183e9652f21c78db1afc213a0e00f1bb41e27b24c1fb583f7d0fc4b4f4d0 Sep 30 17:35:19 crc kubenswrapper[4778]: I0930 17:35:19.725900 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e093cbb-cd23-4a15-8159-fa514c992d60" path="/var/lib/kubelet/pods/3e093cbb-cd23-4a15-8159-fa514c992d60/volumes"
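
The pod_startup_latency_tracker.go entry above is the kubelet's startup-latency report for openstack/cinder-scheduler-0: podStartE2EDuration (here 2.890159108s) is the wall-clock span from podCreationTimestamp to the observed running time, and the zero-value firstStartedPulling/lastFinishedPulling timestamps indicate no image pull was recorded for this start. A minimal sketch, under the same placeholder-file assumption as the earlier snippet, that collects these reports per pod:

    # Sketch: map pod -> reported end-to-end startup duration from
    # "Observed pod startup duration" entries like the one above.
    import re

    STARTUP_RE = re.compile(
        r'"Observed pod startup duration"\s+pod="(?P<pod>[^"]+)".*?'
        r'podStartE2EDuration="(?P<e2e>[^"]+)"',
        re.DOTALL,  # one entry may wrap across physical lines in a flowed dump
    )

    def startup_durations(journal_text: str) -> dict[str, str]:
        # A later report for the same pod overwrites an earlier one.
        return {m.group("pod"): m.group("e2e") for m in STARTUP_RE.finditer(journal_text)}

On this section it would include {'openstack/cinder-scheduler-0': '2.890159108s', 'openstack/openstackclient': '13.019357705s'}.
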
Sep 30 17:35:19 crc kubenswrapper[4778]: I0930 17:35:19.876707 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-26rxw" event={"ID":"65194693-743e-4b92-9bb3-3a9e0ea49273","Type":"ContainerStarted","Data":"e189183e9652f21c78db1afc213a0e00f1bb41e27b24c1fb583f7d0fc4b4f4d0"} Sep 30 17:35:19 crc kubenswrapper[4778]: I0930 17:35:19.878100 4778 generic.go:334] "Generic (PLEG): container finished" podID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" containerID="f46ea7bb14934f472ac14ea511b321663dabc3b03e06c1ab29ba50f82a78b301" exitCode=137 Sep 30 17:35:19 crc kubenswrapper[4778]: I0930 17:35:19.878167 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67d9465564-vcjl8" event={"ID":"b48eff09-cd27-48b0-9633-0fd28d43e0a0","Type":"ContainerDied","Data":"f46ea7bb14934f472ac14ea511b321663dabc3b03e06c1ab29ba50f82a78b301"} Sep 30 17:35:19 crc kubenswrapper[4778]: I0930 17:35:19.879299 4778 generic.go:334] "Generic (PLEG): container finished" podID="2ee6faf7-113e-4545-b3d5-e29e611c4f3e" containerID="b48cedf96a808739d2166f3d99b58dae88e42ebd7548a34252efb5d4f341ee35" exitCode=0 Sep 30 17:35:19 crc kubenswrapper[4778]: I0930 17:35:19.879333 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7qsbn" event={"ID":"2ee6faf7-113e-4545-b3d5-e29e611c4f3e","Type":"ContainerDied","Data":"b48cedf96a808739d2166f3d99b58dae88e42ebd7548a34252efb5d4f341ee35"} Sep 30 17:35:19 crc kubenswrapper[4778]: I0930 17:35:19.879387 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7qsbn" event={"ID":"2ee6faf7-113e-4545-b3d5-e29e611c4f3e","Type":"ContainerStarted","Data":"0c48535ab91c8ddd00e5a567a2f88a7bcb54f14d1ff7823b55a25f08b2235be8"} Sep 30 17:35:19 crc kubenswrapper[4778]: I0930 17:35:19.880761 4778 generic.go:334] "Generic (PLEG): container finished" podID="7a06f678-ac41-4482-8889-c85b2beb6703" containerID="557d0694d8384b4f06f8f9a6206ea4b6f52ba33153491a5e3568b9fa555d9868" exitCode=0 Sep 30 17:35:19 crc kubenswrapper[4778]: I0930 17:35:19.882096 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-d9t4c" event={"ID":"7a06f678-ac41-4482-8889-c85b2beb6703","Type":"ContainerDied","Data":"557d0694d8384b4f06f8f9a6206ea4b6f52ba33153491a5e3568b9fa555d9868"} Sep 30 17:35:19 crc kubenswrapper[4778]: I0930 17:35:19.882123 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-d9t4c" event={"ID":"7a06f678-ac41-4482-8889-c85b2beb6703","Type":"ContainerStarted","Data":"6a796eb311f328591ace16d978346122c37697d257680babe5555d44fd48d198"} Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.025024 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-98b979bf7-grpws" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.114059 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6dcd84db9b-hvcqd"] Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.114378 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6dcd84db9b-hvcqd" podUID="aaa8a326-cf9a-4788-9e58-7c8bfbe92962" containerName="neutron-api" containerID="cri-o://fa3179aa1770f2ea120d9f140816cf257ced5072452c736c4961c586d309688c" gracePeriod=30 Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.114432 4778 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/neutron-6dcd84db9b-hvcqd" podUID="aaa8a326-cf9a-4788-9e58-7c8bfbe92962" containerName="neutron-httpd" containerID="cri-o://7e4454346fd4b2af1658fd800cf2d4d1bd2bf7f514806f9c012b9b4014efe4d7" gracePeriod=30 Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.188139 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.360101 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-horizon-tls-certs\") pod \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.360138 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-combined-ca-bundle\") pod \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.360166 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b48eff09-cd27-48b0-9633-0fd28d43e0a0-logs\") pod \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.360182 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-horizon-secret-key\") pod \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.360197 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b48eff09-cd27-48b0-9633-0fd28d43e0a0-scripts\") pod \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.360259 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b48eff09-cd27-48b0-9633-0fd28d43e0a0-config-data\") pod \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.360313 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb5tj\" (UniqueName: \"kubernetes.io/projected/b48eff09-cd27-48b0-9633-0fd28d43e0a0-kube-api-access-hb5tj\") pod \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\" (UID: \"b48eff09-cd27-48b0-9633-0fd28d43e0a0\") " Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.360870 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b48eff09-cd27-48b0-9633-0fd28d43e0a0-logs" (OuterVolumeSpecName: "logs") pod "b48eff09-cd27-48b0-9633-0fd28d43e0a0" (UID: "b48eff09-cd27-48b0-9633-0fd28d43e0a0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.366652 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b48eff09-cd27-48b0-9633-0fd28d43e0a0-kube-api-access-hb5tj" (OuterVolumeSpecName: "kube-api-access-hb5tj") pod "b48eff09-cd27-48b0-9633-0fd28d43e0a0" (UID: "b48eff09-cd27-48b0-9633-0fd28d43e0a0"). InnerVolumeSpecName "kube-api-access-hb5tj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.380838 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b48eff09-cd27-48b0-9633-0fd28d43e0a0" (UID: "b48eff09-cd27-48b0-9633-0fd28d43e0a0"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.383779 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b48eff09-cd27-48b0-9633-0fd28d43e0a0-scripts" (OuterVolumeSpecName: "scripts") pod "b48eff09-cd27-48b0-9633-0fd28d43e0a0" (UID: "b48eff09-cd27-48b0-9633-0fd28d43e0a0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.383867 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b48eff09-cd27-48b0-9633-0fd28d43e0a0-config-data" (OuterVolumeSpecName: "config-data") pod "b48eff09-cd27-48b0-9633-0fd28d43e0a0" (UID: "b48eff09-cd27-48b0-9633-0fd28d43e0a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.394772 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b48eff09-cd27-48b0-9633-0fd28d43e0a0" (UID: "b48eff09-cd27-48b0-9633-0fd28d43e0a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.415088 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "b48eff09-cd27-48b0-9633-0fd28d43e0a0" (UID: "b48eff09-cd27-48b0-9633-0fd28d43e0a0"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.462640 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b48eff09-cd27-48b0-9633-0fd28d43e0a0-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.462679 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb5tj\" (UniqueName: \"kubernetes.io/projected/b48eff09-cd27-48b0-9633-0fd28d43e0a0-kube-api-access-hb5tj\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.462692 4778 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.462701 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.462710 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b48eff09-cd27-48b0-9633-0fd28d43e0a0-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.462718 4778 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b48eff09-cd27-48b0-9633-0fd28d43e0a0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.462727 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b48eff09-cd27-48b0-9633-0fd28d43e0a0-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.892715 4778 generic.go:334] "Generic (PLEG): container finished" podID="aaa8a326-cf9a-4788-9e58-7c8bfbe92962" containerID="7e4454346fd4b2af1658fd800cf2d4d1bd2bf7f514806f9c012b9b4014efe4d7" exitCode=0 Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.892921 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dcd84db9b-hvcqd" event={"ID":"aaa8a326-cf9a-4788-9e58-7c8bfbe92962","Type":"ContainerDied","Data":"7e4454346fd4b2af1658fd800cf2d4d1bd2bf7f514806f9c012b9b4014efe4d7"} Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.895138 4778 generic.go:334] "Generic (PLEG): container finished" podID="65194693-743e-4b92-9bb3-3a9e0ea49273" containerID="eec7da21c92f3dcc0bde7a14596a5ddf1b71d8679514e4c0ffa4b79eb6ad9971" exitCode=0 Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.895200 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-26rxw" event={"ID":"65194693-743e-4b92-9bb3-3a9e0ea49273","Type":"ContainerDied","Data":"eec7da21c92f3dcc0bde7a14596a5ddf1b71d8679514e4c0ffa4b79eb6ad9971"} Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.904178 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67d9465564-vcjl8" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.904226 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67d9465564-vcjl8" event={"ID":"b48eff09-cd27-48b0-9633-0fd28d43e0a0","Type":"ContainerDied","Data":"ad6ee72acbe69126b6461218bc3d0444dd96a1d9ce67fca6c94b44df90a08d83"} Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.904334 4778 scope.go:117] "RemoveContainer" containerID="e621921148275d7f14a235d97f55f5194ed9f470a27069ec84c67cab8d3f9b84" Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.942308 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67d9465564-vcjl8"] Sep 30 17:35:20 crc kubenswrapper[4778]: I0930 17:35:20.953392 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-67d9465564-vcjl8"] Sep 30 17:35:21 crc kubenswrapper[4778]: I0930 17:35:21.500688 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Sep 30 17:35:21 crc kubenswrapper[4778]: I0930 17:35:21.729289 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" path="/var/lib/kubelet/pods/b48eff09-cd27-48b0-9633-0fd28d43e0a0/volumes" Sep 30 17:35:24 crc kubenswrapper[4778]: I0930 17:35:24.944490 4778 generic.go:334] "Generic (PLEG): container finished" podID="aaa8a326-cf9a-4788-9e58-7c8bfbe92962" containerID="fa3179aa1770f2ea120d9f140816cf257ced5072452c736c4961c586d309688c" exitCode=0 Sep 30 17:35:24 crc kubenswrapper[4778]: I0930 17:35:24.944557 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dcd84db9b-hvcqd" event={"ID":"aaa8a326-cf9a-4788-9e58-7c8bfbe92962","Type":"ContainerDied","Data":"fa3179aa1770f2ea120d9f140816cf257ced5072452c736c4961c586d309688c"} Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.476317 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-d9t4c" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.482331 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-26rxw" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.490704 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-7qsbn" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.639941 4778 scope.go:117] "RemoveContainer" containerID="f46ea7bb14934f472ac14ea511b321663dabc3b03e06c1ab29ba50f82a78b301" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.658924 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbfk8\" (UniqueName: \"kubernetes.io/projected/2ee6faf7-113e-4545-b3d5-e29e611c4f3e-kube-api-access-cbfk8\") pod \"2ee6faf7-113e-4545-b3d5-e29e611c4f3e\" (UID: \"2ee6faf7-113e-4545-b3d5-e29e611c4f3e\") " Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.659033 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz9w4\" (UniqueName: \"kubernetes.io/projected/7a06f678-ac41-4482-8889-c85b2beb6703-kube-api-access-qz9w4\") pod \"7a06f678-ac41-4482-8889-c85b2beb6703\" (UID: \"7a06f678-ac41-4482-8889-c85b2beb6703\") " Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.659429 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vwrr\" (UniqueName: \"kubernetes.io/projected/65194693-743e-4b92-9bb3-3a9e0ea49273-kube-api-access-7vwrr\") pod \"65194693-743e-4b92-9bb3-3a9e0ea49273\" (UID: \"65194693-743e-4b92-9bb3-3a9e0ea49273\") " Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.666211 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65194693-743e-4b92-9bb3-3a9e0ea49273-kube-api-access-7vwrr" (OuterVolumeSpecName: "kube-api-access-7vwrr") pod "65194693-743e-4b92-9bb3-3a9e0ea49273" (UID: "65194693-743e-4b92-9bb3-3a9e0ea49273"). InnerVolumeSpecName "kube-api-access-7vwrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.676500 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a06f678-ac41-4482-8889-c85b2beb6703-kube-api-access-qz9w4" (OuterVolumeSpecName: "kube-api-access-qz9w4") pod "7a06f678-ac41-4482-8889-c85b2beb6703" (UID: "7a06f678-ac41-4482-8889-c85b2beb6703"). InnerVolumeSpecName "kube-api-access-qz9w4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.694267 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ee6faf7-113e-4545-b3d5-e29e611c4f3e-kube-api-access-cbfk8" (OuterVolumeSpecName: "kube-api-access-cbfk8") pod "2ee6faf7-113e-4545-b3d5-e29e611c4f3e" (UID: "2ee6faf7-113e-4545-b3d5-e29e611c4f3e"). InnerVolumeSpecName "kube-api-access-cbfk8". PluginName "kubernetes.io/projected", VolumeGidValue ""
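
The teardown sequence above is the kubelet's standard cleanup for the three finished db-create pods: the reconciler starts UnmountVolume for each projected kube-api-access volume, operation_generator.go confirms TearDown succeeded, and the volumes are then reported detached. A minimal sketch, same assumptions as the earlier snippets, that groups successful teardowns by pod UID so they can be cross-checked against the later "Cleaned up orphaned pod volumes dir" entries:

    # Sketch: group successful UnmountVolume.TearDown entries (shaped like the
    # ones above) by pod UID, listing each volume's OuterVolumeSpecName.
    # The \s+ separators tolerate whitespace from wrapped physical lines.
    import re
    from collections import defaultdict

    TEARDOWN_RE = re.compile(
        r'UnmountVolume\.TearDown succeeded for volume\s+"[^"]+"\s+'
        r'\(OuterVolumeSpecName:\s+"(?P<vol>[^"]+)"\)\s+pod\s+"(?P<uid>[0-9a-f-]+)"'
    )

    def teardowns_by_pod(journal_text: str) -> dict[str, list[str]]:
        out: dict[str, list[str]] = defaultdict(list)
        for m in TEARDOWN_RE.finditer(journal_text):
            out[m.group("uid")].append(m.group("vol"))
        return dict(out)

On the entries just above this yields, for example, '2ee6faf7-113e-4545-b3d5-e29e611c4f3e' -> ['kube-api-access-cbfk8'].
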
Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.761280 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbfk8\" (UniqueName: \"kubernetes.io/projected/2ee6faf7-113e-4545-b3d5-e29e611c4f3e-kube-api-access-cbfk8\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.761311 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz9w4\" (UniqueName: \"kubernetes.io/projected/7a06f678-ac41-4482-8889-c85b2beb6703-kube-api-access-qz9w4\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.761321 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vwrr\" (UniqueName: \"kubernetes.io/projected/65194693-743e-4b92-9bb3-3a9e0ea49273-kube-api-access-7vwrr\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.893951 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.966391 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7qsbn" event={"ID":"2ee6faf7-113e-4545-b3d5-e29e611c4f3e","Type":"ContainerDied","Data":"0c48535ab91c8ddd00e5a567a2f88a7bcb54f14d1ff7823b55a25f08b2235be8"} Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.966706 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c48535ab91c8ddd00e5a567a2f88a7bcb54f14d1ff7823b55a25f08b2235be8" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.966430 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-7qsbn" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.968777 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dcd84db9b-hvcqd" event={"ID":"aaa8a326-cf9a-4788-9e58-7c8bfbe92962","Type":"ContainerDied","Data":"4c7608b9ac4ec80e9c038e49f44d3ecb2a0492c4d771d350537eab7565f63b78"} Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.968806 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6dcd84db9b-hvcqd" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.968814 4778 scope.go:117] "RemoveContainer" containerID="7e4454346fd4b2af1658fd800cf2d4d1bd2bf7f514806f9c012b9b4014efe4d7" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.970344 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-d9t4c" event={"ID":"7a06f678-ac41-4482-8889-c85b2beb6703","Type":"ContainerDied","Data":"6a796eb311f328591ace16d978346122c37697d257680babe5555d44fd48d198"} Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.970360 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a796eb311f328591ace16d978346122c37697d257680babe5555d44fd48d198" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.970401 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-d9t4c" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.979882 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-26rxw" event={"ID":"65194693-743e-4b92-9bb3-3a9e0ea49273","Type":"ContainerDied","Data":"e189183e9652f21c78db1afc213a0e00f1bb41e27b24c1fb583f7d0fc4b4f4d0"} Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.979914 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-26rxw" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.979927 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e189183e9652f21c78db1afc213a0e00f1bb41e27b24c1fb583f7d0fc4b4f4d0" Sep 30 17:35:25 crc kubenswrapper[4778]: I0930 17:35:25.992500 4778 scope.go:117] "RemoveContainer" containerID="fa3179aa1770f2ea120d9f140816cf257ced5072452c736c4961c586d309688c" Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.065102 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-httpd-config\") pod \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.065279 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-combined-ca-bundle\") pod \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.065308 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-ovndb-tls-certs\") pod \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.065362 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-config\") pod \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.065385 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5kvm\" (UniqueName: \"kubernetes.io/projected/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-kube-api-access-c5kvm\") pod \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\" (UID: \"aaa8a326-cf9a-4788-9e58-7c8bfbe92962\") " Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.076105 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "aaa8a326-cf9a-4788-9e58-7c8bfbe92962" (UID: "aaa8a326-cf9a-4788-9e58-7c8bfbe92962"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.092864 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-kube-api-access-c5kvm" (OuterVolumeSpecName: "kube-api-access-c5kvm") pod "aaa8a326-cf9a-4788-9e58-7c8bfbe92962" (UID: "aaa8a326-cf9a-4788-9e58-7c8bfbe92962"). 
InnerVolumeSpecName "kube-api-access-c5kvm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.154980 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aaa8a326-cf9a-4788-9e58-7c8bfbe92962" (UID: "aaa8a326-cf9a-4788-9e58-7c8bfbe92962"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.167045 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5kvm\" (UniqueName: \"kubernetes.io/projected/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-kube-api-access-c5kvm\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.167080 4778 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-httpd-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.167090 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.173808 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "aaa8a326-cf9a-4788-9e58-7c8bfbe92962" (UID: "aaa8a326-cf9a-4788-9e58-7c8bfbe92962"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.205821 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-config" (OuterVolumeSpecName: "config") pod "aaa8a326-cf9a-4788-9e58-7c8bfbe92962" (UID: "aaa8a326-cf9a-4788-9e58-7c8bfbe92962"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.269110 4778 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.269144 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/aaa8a326-cf9a-4788-9e58-7c8bfbe92962-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.296190 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6dcd84db9b-hvcqd"] Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.302295 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6dcd84db9b-hvcqd"] Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.731768 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 17:35:26 crc kubenswrapper[4778]: I0930 17:35:26.993030 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"9bd1cc25-8254-44e2-b64a-1711eed0609e","Type":"ContainerStarted","Data":"baf9f08aefca05828edfa3fdd96b8781f942df197e2bbc4273728e058557f902"} Sep 30 17:35:27 crc kubenswrapper[4778]: I0930 17:35:27.019379 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.256439621 podStartE2EDuration="13.019357705s" podCreationTimestamp="2025-09-30 17:35:14 +0000 UTC" firstStartedPulling="2025-09-30 17:35:14.930061283 +0000 UTC m=+1053.919959126" lastFinishedPulling="2025-09-30 17:35:25.692979387 +0000 UTC m=+1064.682877210" observedRunningTime="2025-09-30 17:35:27.013133099 +0000 UTC m=+1066.003030922" watchObservedRunningTime="2025-09-30 17:35:27.019357705 +0000 UTC m=+1066.009255538" Sep 30 17:35:27 crc kubenswrapper[4778]: I0930 17:35:27.729373 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaa8a326-cf9a-4788-9e58-7c8bfbe92962" path="/var/lib/kubelet/pods/aaa8a326-cf9a-4788-9e58-7c8bfbe92962/volumes" Sep 30 17:35:27 crc kubenswrapper[4778]: I0930 17:35:27.982061 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 17:35:27 crc kubenswrapper[4778]: I0930 17:35:27.982398 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="02f673aa-9764-4ff8-af4c-42eeaefd2d05" containerName="glance-log" containerID="cri-o://f78dc41c523244de92dbd3a1259d3cb4401e0a812fba5d6d8377e1001a06b0e7" gracePeriod=30 Sep 30 17:35:27 crc kubenswrapper[4778]: I0930 17:35:27.982488 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="02f673aa-9764-4ff8-af4c-42eeaefd2d05" containerName="glance-httpd" containerID="cri-o://14a58edb9355c8a4b88820e76d44c0d225196d8d2e1c0e6a91d9a8ae7b960902" gracePeriod=30 Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.712372 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.712815 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5fa61a80-4c21-425c-be4b-35ceac5c1d48" containerName="glance-httpd" 
containerID="cri-o://b5a435f83dc96d7d29c1d79b339d933b9c293bfdfabdfef8845954ecf9e7fdf4" gracePeriod=30 Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.714046 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5fa61a80-4c21-425c-be4b-35ceac5c1d48" containerName="glance-log" containerID="cri-o://d73ad7a8c5107070fcea7006bd4439a8f4e64b8854a15902161f47775ee98fd7" gracePeriod=30 Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.809522 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-80a3-account-create-hjhw5"] Sep 30 17:35:28 crc kubenswrapper[4778]: E0930 17:35:28.810146 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a06f678-ac41-4482-8889-c85b2beb6703" containerName="mariadb-database-create" Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.810158 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a06f678-ac41-4482-8889-c85b2beb6703" containerName="mariadb-database-create" Sep 30 17:35:28 crc kubenswrapper[4778]: E0930 17:35:28.810172 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" containerName="horizon" Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.810178 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" containerName="horizon" Sep 30 17:35:28 crc kubenswrapper[4778]: E0930 17:35:28.810192 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" containerName="horizon-log" Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.810199 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" containerName="horizon-log" Sep 30 17:35:28 crc kubenswrapper[4778]: E0930 17:35:28.810208 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65194693-743e-4b92-9bb3-3a9e0ea49273" containerName="mariadb-database-create" Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.810213 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="65194693-743e-4b92-9bb3-3a9e0ea49273" containerName="mariadb-database-create" Sep 30 17:35:28 crc kubenswrapper[4778]: E0930 17:35:28.810222 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ee6faf7-113e-4545-b3d5-e29e611c4f3e" containerName="mariadb-database-create" Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.810228 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ee6faf7-113e-4545-b3d5-e29e611c4f3e" containerName="mariadb-database-create" Sep 30 17:35:28 crc kubenswrapper[4778]: E0930 17:35:28.810252 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaa8a326-cf9a-4788-9e58-7c8bfbe92962" containerName="neutron-httpd" Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.810258 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa8a326-cf9a-4788-9e58-7c8bfbe92962" containerName="neutron-httpd" Sep 30 17:35:28 crc kubenswrapper[4778]: E0930 17:35:28.810265 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaa8a326-cf9a-4788-9e58-7c8bfbe92962" containerName="neutron-api" Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.810271 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa8a326-cf9a-4788-9e58-7c8bfbe92962" containerName="neutron-api" Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.810414 4778 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" containerName="horizon-log" Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.810427 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ee6faf7-113e-4545-b3d5-e29e611c4f3e" containerName="mariadb-database-create" Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.810439 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaa8a326-cf9a-4788-9e58-7c8bfbe92962" containerName="neutron-api" Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.810446 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a06f678-ac41-4482-8889-c85b2beb6703" containerName="mariadb-database-create" Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.810454 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="65194693-743e-4b92-9bb3-3a9e0ea49273" containerName="mariadb-database-create" Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.810462 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaa8a326-cf9a-4788-9e58-7c8bfbe92962" containerName="neutron-httpd" Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.810470 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b48eff09-cd27-48b0-9633-0fd28d43e0a0" containerName="horizon" Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.810993 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-80a3-account-create-hjhw5" Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.814837 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.820758 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-80a3-account-create-hjhw5"] Sep 30 17:35:28 crc kubenswrapper[4778]: I0930 17:35:28.921723 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mttp\" (UniqueName: \"kubernetes.io/projected/b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db-kube-api-access-7mttp\") pod \"nova-api-80a3-account-create-hjhw5\" (UID: \"b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db\") " pod="openstack/nova-api-80a3-account-create-hjhw5" Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.004794 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-e68e-account-create-2z8ks"] Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.006434 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e68e-account-create-2z8ks" Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.008721 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.019205 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e68e-account-create-2z8ks"] Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.020431 4778 generic.go:334] "Generic (PLEG): container finished" podID="02f673aa-9764-4ff8-af4c-42eeaefd2d05" containerID="f78dc41c523244de92dbd3a1259d3cb4401e0a812fba5d6d8377e1001a06b0e7" exitCode=143 Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.020483 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"02f673aa-9764-4ff8-af4c-42eeaefd2d05","Type":"ContainerDied","Data":"f78dc41c523244de92dbd3a1259d3cb4401e0a812fba5d6d8377e1001a06b0e7"} Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.023821 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mttp\" (UniqueName: \"kubernetes.io/projected/b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db-kube-api-access-7mttp\") pod \"nova-api-80a3-account-create-hjhw5\" (UID: \"b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db\") " pod="openstack/nova-api-80a3-account-create-hjhw5" Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.028673 4778 generic.go:334] "Generic (PLEG): container finished" podID="5fa61a80-4c21-425c-be4b-35ceac5c1d48" containerID="d73ad7a8c5107070fcea7006bd4439a8f4e64b8854a15902161f47775ee98fd7" exitCode=143 Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.028714 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5fa61a80-4c21-425c-be4b-35ceac5c1d48","Type":"ContainerDied","Data":"d73ad7a8c5107070fcea7006bd4439a8f4e64b8854a15902161f47775ee98fd7"} Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.056098 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mttp\" (UniqueName: \"kubernetes.io/projected/b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db-kube-api-access-7mttp\") pod \"nova-api-80a3-account-create-hjhw5\" (UID: \"b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db\") " pod="openstack/nova-api-80a3-account-create-hjhw5" Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.125181 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdbfh\" (UniqueName: \"kubernetes.io/projected/330a8b1b-c4dc-40d1-9af3-ba2d151803aa-kube-api-access-jdbfh\") pod \"nova-cell0-e68e-account-create-2z8ks\" (UID: \"330a8b1b-c4dc-40d1-9af3-ba2d151803aa\") " pod="openstack/nova-cell0-e68e-account-create-2z8ks" Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.179839 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-80a3-account-create-hjhw5" Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.224667 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e14b-account-create-ssj7x"] Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.227282 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e14b-account-create-ssj7x" Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.229883 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.233900 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e14b-account-create-ssj7x"] Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.243681 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd9fd\" (UniqueName: \"kubernetes.io/projected/c2812856-059f-4b6c-975d-7fd7afcdcd05-kube-api-access-pd9fd\") pod \"nova-cell1-e14b-account-create-ssj7x\" (UID: \"c2812856-059f-4b6c-975d-7fd7afcdcd05\") " pod="openstack/nova-cell1-e14b-account-create-ssj7x" Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.243749 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdbfh\" (UniqueName: \"kubernetes.io/projected/330a8b1b-c4dc-40d1-9af3-ba2d151803aa-kube-api-access-jdbfh\") pod \"nova-cell0-e68e-account-create-2z8ks\" (UID: \"330a8b1b-c4dc-40d1-9af3-ba2d151803aa\") " pod="openstack/nova-cell0-e68e-account-create-2z8ks" Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.261644 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdbfh\" (UniqueName: \"kubernetes.io/projected/330a8b1b-c4dc-40d1-9af3-ba2d151803aa-kube-api-access-jdbfh\") pod \"nova-cell0-e68e-account-create-2z8ks\" (UID: \"330a8b1b-c4dc-40d1-9af3-ba2d151803aa\") " pod="openstack/nova-cell0-e68e-account-create-2z8ks" Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.322234 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e68e-account-create-2z8ks" Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.345084 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd9fd\" (UniqueName: \"kubernetes.io/projected/c2812856-059f-4b6c-975d-7fd7afcdcd05-kube-api-access-pd9fd\") pod \"nova-cell1-e14b-account-create-ssj7x\" (UID: \"c2812856-059f-4b6c-975d-7fd7afcdcd05\") " pod="openstack/nova-cell1-e14b-account-create-ssj7x" Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.365759 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd9fd\" (UniqueName: \"kubernetes.io/projected/c2812856-059f-4b6c-975d-7fd7afcdcd05-kube-api-access-pd9fd\") pod \"nova-cell1-e14b-account-create-ssj7x\" (UID: \"c2812856-059f-4b6c-975d-7fd7afcdcd05\") " pod="openstack/nova-cell1-e14b-account-create-ssj7x" Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.638712 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e14b-account-create-ssj7x" Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.660659 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-80a3-account-create-hjhw5"] Sep 30 17:35:29 crc kubenswrapper[4778]: W0930 17:35:29.667849 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb0646dac_3ae8_43a0_9a9a_6d0d6ad2f6db.slice/crio-a9318b46a8967766e5fa1774cd4f81c54f5c13914b6ee937aa48a80059e35d70 WatchSource:0}: Error finding container a9318b46a8967766e5fa1774cd4f81c54f5c13914b6ee937aa48a80059e35d70: Status 404 returned error can't find the container with id a9318b46a8967766e5fa1774cd4f81c54f5c13914b6ee937aa48a80059e35d70 Sep 30 17:35:29 crc kubenswrapper[4778]: I0930 17:35:29.843658 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-e68e-account-create-2z8ks"] Sep 30 17:35:30 crc kubenswrapper[4778]: I0930 17:35:30.036861 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e14b-account-create-ssj7x"] Sep 30 17:35:30 crc kubenswrapper[4778]: W0930 17:35:30.037989 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2812856_059f_4b6c_975d_7fd7afcdcd05.slice/crio-04d69f1d859a290bba7bf6bf1ad673a82e9d6c21fda458b933b615ec6d1a9ac9 WatchSource:0}: Error finding container 04d69f1d859a290bba7bf6bf1ad673a82e9d6c21fda458b933b615ec6d1a9ac9: Status 404 returned error can't find the container with id 04d69f1d859a290bba7bf6bf1ad673a82e9d6c21fda458b933b615ec6d1a9ac9 Sep 30 17:35:30 crc kubenswrapper[4778]: I0930 17:35:30.038691 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e68e-account-create-2z8ks" event={"ID":"330a8b1b-c4dc-40d1-9af3-ba2d151803aa","Type":"ContainerStarted","Data":"1e90c849502d652235be13096d7c92503893b8a572bd1b90f8686a8bb2782454"} Sep 30 17:35:30 crc kubenswrapper[4778]: I0930 17:35:30.038718 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e68e-account-create-2z8ks" event={"ID":"330a8b1b-c4dc-40d1-9af3-ba2d151803aa","Type":"ContainerStarted","Data":"730b6916a8682c3f9cc91d8627fd951f0b9bff2baa918a9aa6e1acb0b7f47c85"} Sep 30 17:35:30 crc kubenswrapper[4778]: I0930 17:35:30.040369 4778 generic.go:334] "Generic (PLEG): container finished" podID="b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db" containerID="339d10f03296fa4ecb05b496d6d8a27094539ab0c9299a9669948aee89faae53" exitCode=0 Sep 30 17:35:30 crc kubenswrapper[4778]: I0930 17:35:30.040418 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-80a3-account-create-hjhw5" event={"ID":"b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db","Type":"ContainerDied","Data":"339d10f03296fa4ecb05b496d6d8a27094539ab0c9299a9669948aee89faae53"} Sep 30 17:35:30 crc kubenswrapper[4778]: I0930 17:35:30.040449 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-80a3-account-create-hjhw5" event={"ID":"b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db","Type":"ContainerStarted","Data":"a9318b46a8967766e5fa1774cd4f81c54f5c13914b6ee937aa48a80059e35d70"} Sep 30 17:35:30 crc kubenswrapper[4778]: I0930 17:35:30.067243 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-e68e-account-create-2z8ks" podStartSLOduration=2.067223492 podStartE2EDuration="2.067223492s" podCreationTimestamp="2025-09-30 17:35:28 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:35:30.05224168 +0000 UTC m=+1069.042139483" watchObservedRunningTime="2025-09-30 17:35:30.067223492 +0000 UTC m=+1069.057121305" Sep 30 17:35:31 crc kubenswrapper[4778]: I0930 17:35:31.054536 4778 generic.go:334] "Generic (PLEG): container finished" podID="c2812856-059f-4b6c-975d-7fd7afcdcd05" containerID="0d626821f9389abdcfb4d97202069113d6c13603fcfac4f875d8ad7fd084e962" exitCode=0 Sep 30 17:35:31 crc kubenswrapper[4778]: I0930 17:35:31.054599 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e14b-account-create-ssj7x" event={"ID":"c2812856-059f-4b6c-975d-7fd7afcdcd05","Type":"ContainerDied","Data":"0d626821f9389abdcfb4d97202069113d6c13603fcfac4f875d8ad7fd084e962"} Sep 30 17:35:31 crc kubenswrapper[4778]: I0930 17:35:31.055957 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e14b-account-create-ssj7x" event={"ID":"c2812856-059f-4b6c-975d-7fd7afcdcd05","Type":"ContainerStarted","Data":"04d69f1d859a290bba7bf6bf1ad673a82e9d6c21fda458b933b615ec6d1a9ac9"} Sep 30 17:35:31 crc kubenswrapper[4778]: I0930 17:35:31.058493 4778 generic.go:334] "Generic (PLEG): container finished" podID="330a8b1b-c4dc-40d1-9af3-ba2d151803aa" containerID="1e90c849502d652235be13096d7c92503893b8a572bd1b90f8686a8bb2782454" exitCode=0 Sep 30 17:35:31 crc kubenswrapper[4778]: I0930 17:35:31.058544 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e68e-account-create-2z8ks" event={"ID":"330a8b1b-c4dc-40d1-9af3-ba2d151803aa","Type":"ContainerDied","Data":"1e90c849502d652235be13096d7c92503893b8a572bd1b90f8686a8bb2782454"} Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:31.448503 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-80a3-account-create-hjhw5" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:31.625010 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mttp\" (UniqueName: \"kubernetes.io/projected/b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db-kube-api-access-7mttp\") pod \"b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db\" (UID: \"b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:31.646268 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db-kube-api-access-7mttp" (OuterVolumeSpecName: "kube-api-access-7mttp") pod "b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db" (UID: "b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db"). InnerVolumeSpecName "kube-api-access-7mttp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:31.727678 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mttp\" (UniqueName: \"kubernetes.io/projected/b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db-kube-api-access-7mttp\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.088203 4778 generic.go:334] "Generic (PLEG): container finished" podID="02f673aa-9764-4ff8-af4c-42eeaefd2d05" containerID="14a58edb9355c8a4b88820e76d44c0d225196d8d2e1c0e6a91d9a8ae7b960902" exitCode=0 Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.088278 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"02f673aa-9764-4ff8-af4c-42eeaefd2d05","Type":"ContainerDied","Data":"14a58edb9355c8a4b88820e76d44c0d225196d8d2e1c0e6a91d9a8ae7b960902"} Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.092717 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-80a3-account-create-hjhw5" event={"ID":"b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db","Type":"ContainerDied","Data":"a9318b46a8967766e5fa1774cd4f81c54f5c13914b6ee937aa48a80059e35d70"} Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.092749 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9318b46a8967766e5fa1774cd4f81c54f5c13914b6ee937aa48a80059e35d70" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.092761 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-80a3-account-create-hjhw5" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.094866 4778 generic.go:334] "Generic (PLEG): container finished" podID="5fa61a80-4c21-425c-be4b-35ceac5c1d48" containerID="b5a435f83dc96d7d29c1d79b339d933b9c293bfdfabdfef8845954ecf9e7fdf4" exitCode=0 Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.094930 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5fa61a80-4c21-425c-be4b-35ceac5c1d48","Type":"ContainerDied","Data":"b5a435f83dc96d7d29c1d79b339d933b9c293bfdfabdfef8845954ecf9e7fdf4"} Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.343717 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.523452 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.527757 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e14b-account-create-ssj7x" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.537504 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-e68e-account-create-2z8ks" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.544164 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-scripts\") pod \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.544215 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-config-data\") pod \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.544260 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.544279 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-public-tls-certs\") pod \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.544326 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tflz4\" (UniqueName: \"kubernetes.io/projected/02f673aa-9764-4ff8-af4c-42eeaefd2d05-kube-api-access-tflz4\") pod \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.544384 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02f673aa-9764-4ff8-af4c-42eeaefd2d05-logs\") pod \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.544404 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-combined-ca-bundle\") pod \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.544428 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02f673aa-9764-4ff8-af4c-42eeaefd2d05-httpd-run\") pod \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\" (UID: \"02f673aa-9764-4ff8-af4c-42eeaefd2d05\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.545757 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02f673aa-9764-4ff8-af4c-42eeaefd2d05-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "02f673aa-9764-4ff8-af4c-42eeaefd2d05" (UID: "02f673aa-9764-4ff8-af4c-42eeaefd2d05"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.546745 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02f673aa-9764-4ff8-af4c-42eeaefd2d05-logs" (OuterVolumeSpecName: "logs") pod "02f673aa-9764-4ff8-af4c-42eeaefd2d05" (UID: "02f673aa-9764-4ff8-af4c-42eeaefd2d05"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.555806 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "02f673aa-9764-4ff8-af4c-42eeaefd2d05" (UID: "02f673aa-9764-4ff8-af4c-42eeaefd2d05"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.580456 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02f673aa-9764-4ff8-af4c-42eeaefd2d05-kube-api-access-tflz4" (OuterVolumeSpecName: "kube-api-access-tflz4") pod "02f673aa-9764-4ff8-af4c-42eeaefd2d05" (UID: "02f673aa-9764-4ff8-af4c-42eeaefd2d05"). InnerVolumeSpecName "kube-api-access-tflz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.582801 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-scripts" (OuterVolumeSpecName: "scripts") pod "02f673aa-9764-4ff8-af4c-42eeaefd2d05" (UID: "02f673aa-9764-4ff8-af4c-42eeaefd2d05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.635377 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-config-data" (OuterVolumeSpecName: "config-data") pod "02f673aa-9764-4ff8-af4c-42eeaefd2d05" (UID: "02f673aa-9764-4ff8-af4c-42eeaefd2d05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.643274 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02f673aa-9764-4ff8-af4c-42eeaefd2d05" (UID: "02f673aa-9764-4ff8-af4c-42eeaefd2d05"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.647352 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5fa61a80-4c21-425c-be4b-35ceac5c1d48-httpd-run\") pod \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.647434 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljsfz\" (UniqueName: \"kubernetes.io/projected/5fa61a80-4c21-425c-be4b-35ceac5c1d48-kube-api-access-ljsfz\") pod \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.647461 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-internal-tls-certs\") pod \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.647549 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fa61a80-4c21-425c-be4b-35ceac5c1d48-logs\") pod \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.647647 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-combined-ca-bundle\") pod \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.647674 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd9fd\" (UniqueName: \"kubernetes.io/projected/c2812856-059f-4b6c-975d-7fd7afcdcd05-kube-api-access-pd9fd\") pod \"c2812856-059f-4b6c-975d-7fd7afcdcd05\" (UID: \"c2812856-059f-4b6c-975d-7fd7afcdcd05\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.647727 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-scripts\") pod \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.647767 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdbfh\" (UniqueName: \"kubernetes.io/projected/330a8b1b-c4dc-40d1-9af3-ba2d151803aa-kube-api-access-jdbfh\") pod \"330a8b1b-c4dc-40d1-9af3-ba2d151803aa\" (UID: \"330a8b1b-c4dc-40d1-9af3-ba2d151803aa\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.647822 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-config-data\") pod \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.647886 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\" (UID: \"5fa61a80-4c21-425c-be4b-35ceac5c1d48\") " 
Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.655548 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/02f673aa-9764-4ff8-af4c-42eeaefd2d05-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.655583 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.655604 4778 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/02f673aa-9764-4ff8-af4c-42eeaefd2d05-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.655633 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.655644 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.655682 4778 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.655698 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tflz4\" (UniqueName: \"kubernetes.io/projected/02f673aa-9764-4ff8-af4c-42eeaefd2d05-kube-api-access-tflz4\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.660140 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fa61a80-4c21-425c-be4b-35ceac5c1d48-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5fa61a80-4c21-425c-be4b-35ceac5c1d48" (UID: "5fa61a80-4c21-425c-be4b-35ceac5c1d48"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.660765 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fa61a80-4c21-425c-be4b-35ceac5c1d48-logs" (OuterVolumeSpecName: "logs") pod "5fa61a80-4c21-425c-be4b-35ceac5c1d48" (UID: "5fa61a80-4c21-425c-be4b-35ceac5c1d48"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.661517 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-scripts" (OuterVolumeSpecName: "scripts") pod "5fa61a80-4c21-425c-be4b-35ceac5c1d48" (UID: "5fa61a80-4c21-425c-be4b-35ceac5c1d48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.674544 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fa61a80-4c21-425c-be4b-35ceac5c1d48-kube-api-access-ljsfz" (OuterVolumeSpecName: "kube-api-access-ljsfz") pod "5fa61a80-4c21-425c-be4b-35ceac5c1d48" (UID: "5fa61a80-4c21-425c-be4b-35ceac5c1d48"). InnerVolumeSpecName "kube-api-access-ljsfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.674556 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "5fa61a80-4c21-425c-be4b-35ceac5c1d48" (UID: "5fa61a80-4c21-425c-be4b-35ceac5c1d48"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.674768 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/330a8b1b-c4dc-40d1-9af3-ba2d151803aa-kube-api-access-jdbfh" (OuterVolumeSpecName: "kube-api-access-jdbfh") pod "330a8b1b-c4dc-40d1-9af3-ba2d151803aa" (UID: "330a8b1b-c4dc-40d1-9af3-ba2d151803aa"). InnerVolumeSpecName "kube-api-access-jdbfh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.678016 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2812856-059f-4b6c-975d-7fd7afcdcd05-kube-api-access-pd9fd" (OuterVolumeSpecName: "kube-api-access-pd9fd") pod "c2812856-059f-4b6c-975d-7fd7afcdcd05" (UID: "c2812856-059f-4b6c-975d-7fd7afcdcd05"). InnerVolumeSpecName "kube-api-access-pd9fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.708404 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fa61a80-4c21-425c-be4b-35ceac5c1d48" (UID: "5fa61a80-4c21-425c-be4b-35ceac5c1d48"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.715439 4778 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.721476 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "02f673aa-9764-4ff8-af4c-42eeaefd2d05" (UID: "02f673aa-9764-4ff8-af4c-42eeaefd2d05"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.744425 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-config-data" (OuterVolumeSpecName: "config-data") pod "5fa61a80-4c21-425c-be4b-35ceac5c1d48" (UID: "5fa61a80-4c21-425c-be4b-35ceac5c1d48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.755647 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5fa61a80-4c21-425c-be4b-35ceac5c1d48" (UID: "5fa61a80-4c21-425c-be4b-35ceac5c1d48"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.764183 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.764319 4778 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.766198 4778 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5fa61a80-4c21-425c-be4b-35ceac5c1d48-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.766233 4778 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.766244 4778 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/02f673aa-9764-4ff8-af4c-42eeaefd2d05-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.766257 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljsfz\" (UniqueName: \"kubernetes.io/projected/5fa61a80-4c21-425c-be4b-35ceac5c1d48-kube-api-access-ljsfz\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.766267 4778 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.766275 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fa61a80-4c21-425c-be4b-35ceac5c1d48-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.766283 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.766293 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd9fd\" (UniqueName: \"kubernetes.io/projected/c2812856-059f-4b6c-975d-7fd7afcdcd05-kube-api-access-pd9fd\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.766314 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fa61a80-4c21-425c-be4b-35ceac5c1d48-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.766341 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdbfh\" (UniqueName: \"kubernetes.io/projected/330a8b1b-c4dc-40d1-9af3-ba2d151803aa-kube-api-access-jdbfh\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.782165 4778 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Sep 30 17:35:32 crc kubenswrapper[4778]: I0930 17:35:32.868397 4778 reconciler_common.go:293] 
"Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.105005 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-e68e-account-create-2z8ks" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.105017 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-e68e-account-create-2z8ks" event={"ID":"330a8b1b-c4dc-40d1-9af3-ba2d151803aa","Type":"ContainerDied","Data":"730b6916a8682c3f9cc91d8627fd951f0b9bff2baa918a9aa6e1acb0b7f47c85"} Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.105136 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="730b6916a8682c3f9cc91d8627fd951f0b9bff2baa918a9aa6e1acb0b7f47c85" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.107082 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5fa61a80-4c21-425c-be4b-35ceac5c1d48","Type":"ContainerDied","Data":"4229c9e36d43de023ca7c8e1cb141fa684a883b91fa9f0f8d3deffa3fa26afc7"} Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.107128 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.107156 4778 scope.go:117] "RemoveContainer" containerID="b5a435f83dc96d7d29c1d79b339d933b9c293bfdfabdfef8845954ecf9e7fdf4" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.108964 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"02f673aa-9764-4ff8-af4c-42eeaefd2d05","Type":"ContainerDied","Data":"eb0aee333b62778fd1bcc2aa54f7ca550eb399fd537cc272d574c067f6bf9db9"} Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.109055 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.111524 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e14b-account-create-ssj7x" event={"ID":"c2812856-059f-4b6c-975d-7fd7afcdcd05","Type":"ContainerDied","Data":"04d69f1d859a290bba7bf6bf1ad673a82e9d6c21fda458b933b615ec6d1a9ac9"} Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.111559 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04d69f1d859a290bba7bf6bf1ad673a82e9d6c21fda458b933b615ec6d1a9ac9" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.111605 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e14b-account-create-ssj7x" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.134694 4778 scope.go:117] "RemoveContainer" containerID="d73ad7a8c5107070fcea7006bd4439a8f4e64b8854a15902161f47775ee98fd7" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.157796 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.162199 4778 scope.go:117] "RemoveContainer" containerID="14a58edb9355c8a4b88820e76d44c0d225196d8d2e1c0e6a91d9a8ae7b960902" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.167798 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.176044 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.183164 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.186166 4778 scope.go:117] "RemoveContainer" containerID="f78dc41c523244de92dbd3a1259d3cb4401e0a812fba5d6d8377e1001a06b0e7" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.191153 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 17:35:33 crc kubenswrapper[4778]: E0930 17:35:33.191536 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="330a8b1b-c4dc-40d1-9af3-ba2d151803aa" containerName="mariadb-account-create" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.191596 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="330a8b1b-c4dc-40d1-9af3-ba2d151803aa" containerName="mariadb-account-create" Sep 30 17:35:33 crc kubenswrapper[4778]: E0930 17:35:33.191671 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02f673aa-9764-4ff8-af4c-42eeaefd2d05" containerName="glance-log" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.191729 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="02f673aa-9764-4ff8-af4c-42eeaefd2d05" containerName="glance-log" Sep 30 17:35:33 crc kubenswrapper[4778]: E0930 17:35:33.191803 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2812856-059f-4b6c-975d-7fd7afcdcd05" containerName="mariadb-account-create" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.191850 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2812856-059f-4b6c-975d-7fd7afcdcd05" containerName="mariadb-account-create" Sep 30 17:35:33 crc kubenswrapper[4778]: E0930 17:35:33.191923 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fa61a80-4c21-425c-be4b-35ceac5c1d48" containerName="glance-httpd" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.191976 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fa61a80-4c21-425c-be4b-35ceac5c1d48" containerName="glance-httpd" Sep 30 17:35:33 crc kubenswrapper[4778]: E0930 17:35:33.192024 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02f673aa-9764-4ff8-af4c-42eeaefd2d05" containerName="glance-httpd" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.192067 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="02f673aa-9764-4ff8-af4c-42eeaefd2d05" containerName="glance-httpd" Sep 30 17:35:33 crc kubenswrapper[4778]: E0930 17:35:33.192115 4778 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db" containerName="mariadb-account-create" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.192160 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db" containerName="mariadb-account-create" Sep 30 17:35:33 crc kubenswrapper[4778]: E0930 17:35:33.192217 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fa61a80-4c21-425c-be4b-35ceac5c1d48" containerName="glance-log" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.192281 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fa61a80-4c21-425c-be4b-35ceac5c1d48" containerName="glance-log" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.192513 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fa61a80-4c21-425c-be4b-35ceac5c1d48" containerName="glance-httpd" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.192585 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="02f673aa-9764-4ff8-af4c-42eeaefd2d05" containerName="glance-log" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.192650 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="02f673aa-9764-4ff8-af4c-42eeaefd2d05" containerName="glance-httpd" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.192712 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fa61a80-4c21-425c-be4b-35ceac5c1d48" containerName="glance-log" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.193346 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="330a8b1b-c4dc-40d1-9af3-ba2d151803aa" containerName="mariadb-account-create" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.193404 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db" containerName="mariadb-account-create" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.195875 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2812856-059f-4b6c-975d-7fd7afcdcd05" containerName="mariadb-account-create" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.197051 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.204562 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.206327 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.207358 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.207632 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v8k48" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.207785 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.208536 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.208830 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.208953 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.210735 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.240114 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.375870 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-logs\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.375934 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs57z\" (UniqueName: \"kubernetes.io/projected/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-kube-api-access-zs57z\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.375963 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.375984 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.376095 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-config-data\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.376220 4778 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df6e6a3a-5259-4462-9af0-439627e7cd46-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.376314 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6e6a3a-5259-4462-9af0-439627e7cd46-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.376399 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vnzs\" (UniqueName: \"kubernetes.io/projected/df6e6a3a-5259-4462-9af0-439627e7cd46-kube-api-access-9vnzs\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.376427 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.376474 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.376675 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df6e6a3a-5259-4462-9af0-439627e7cd46-logs\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.376760 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df6e6a3a-5259-4462-9af0-439627e7cd46-scripts\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.376789 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.376821 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-scripts\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc 
kubenswrapper[4778]: I0930 17:35:33.377147 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6e6a3a-5259-4462-9af0-439627e7cd46-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.377332 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df6e6a3a-5259-4462-9af0-439627e7cd46-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.478575 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.478629 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.478661 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-config-data\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.478687 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df6e6a3a-5259-4462-9af0-439627e7cd46-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.478722 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6e6a3a-5259-4462-9af0-439627e7cd46-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.478756 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vnzs\" (UniqueName: \"kubernetes.io/projected/df6e6a3a-5259-4462-9af0-439627e7cd46-kube-api-access-9vnzs\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.478780 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc 
kubenswrapper[4778]: I0930 17:35:33.478804 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.478835 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df6e6a3a-5259-4462-9af0-439627e7cd46-logs\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.478852 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df6e6a3a-5259-4462-9af0-439627e7cd46-scripts\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.478867 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.478883 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-scripts\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.478910 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6e6a3a-5259-4462-9af0-439627e7cd46-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.478964 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df6e6a3a-5259-4462-9af0-439627e7cd46-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.478989 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-logs\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.479012 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs57z\" (UniqueName: \"kubernetes.io/projected/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-kube-api-access-zs57z\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.479665 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.479735 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df6e6a3a-5259-4462-9af0-439627e7cd46-logs\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.480386 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.480705 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df6e6a3a-5259-4462-9af0-439627e7cd46-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.481073 4778 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.481181 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-logs\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.483904 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.484798 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.486395 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df6e6a3a-5259-4462-9af0-439627e7cd46-scripts\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.483865 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df6e6a3a-5259-4462-9af0-439627e7cd46-combined-ca-bundle\") pod \"glance-default-internal-api-0\" 
(UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.489580 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-scripts\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.490148 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-config-data\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.500582 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df6e6a3a-5259-4462-9af0-439627e7cd46-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.502015 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df6e6a3a-5259-4462-9af0-439627e7cd46-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.504090 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs57z\" (UniqueName: \"kubernetes.io/projected/d1abcff4-99b6-41ab-a5b1-fb4c36f22711-kube-api-access-zs57z\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.522122 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vnzs\" (UniqueName: \"kubernetes.io/projected/df6e6a3a-5259-4462-9af0-439627e7cd46-kube-api-access-9vnzs\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.525504 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"df6e6a3a-5259-4462-9af0-439627e7cd46\") " pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.536211 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"d1abcff4-99b6-41ab-a5b1-fb4c36f22711\") " pod="openstack/glance-default-external-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.726834 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02f673aa-9764-4ff8-af4c-42eeaefd2d05" path="/var/lib/kubelet/pods/02f673aa-9764-4ff8-af4c-42eeaefd2d05/volumes" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.727733 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fa61a80-4c21-425c-be4b-35ceac5c1d48" 
path="/var/lib/kubelet/pods/5fa61a80-4c21-425c-be4b-35ceac5c1d48/volumes" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.825691 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 17:35:33 crc kubenswrapper[4778]: I0930 17:35:33.836476 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.221974 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tcrrp"] Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.223446 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tcrrp" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.226051 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.226228 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.226332 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pt2zr" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.236933 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tcrrp"] Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.403860 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp82g\" (UniqueName: \"kubernetes.io/projected/26eefd11-813a-4df9-831d-0a9e60b1be73-kube-api-access-tp82g\") pod \"nova-cell0-conductor-db-sync-tcrrp\" (UID: \"26eefd11-813a-4df9-831d-0a9e60b1be73\") " pod="openstack/nova-cell0-conductor-db-sync-tcrrp" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.403928 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-scripts\") pod \"nova-cell0-conductor-db-sync-tcrrp\" (UID: \"26eefd11-813a-4df9-831d-0a9e60b1be73\") " pod="openstack/nova-cell0-conductor-db-sync-tcrrp" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.403971 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-config-data\") pod \"nova-cell0-conductor-db-sync-tcrrp\" (UID: \"26eefd11-813a-4df9-831d-0a9e60b1be73\") " pod="openstack/nova-cell0-conductor-db-sync-tcrrp" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.404001 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tcrrp\" (UID: \"26eefd11-813a-4df9-831d-0a9e60b1be73\") " pod="openstack/nova-cell0-conductor-db-sync-tcrrp" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.480154 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.505852 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp82g\" (UniqueName: 
\"kubernetes.io/projected/26eefd11-813a-4df9-831d-0a9e60b1be73-kube-api-access-tp82g\") pod \"nova-cell0-conductor-db-sync-tcrrp\" (UID: \"26eefd11-813a-4df9-831d-0a9e60b1be73\") " pod="openstack/nova-cell0-conductor-db-sync-tcrrp" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.505903 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-scripts\") pod \"nova-cell0-conductor-db-sync-tcrrp\" (UID: \"26eefd11-813a-4df9-831d-0a9e60b1be73\") " pod="openstack/nova-cell0-conductor-db-sync-tcrrp" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.505936 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-config-data\") pod \"nova-cell0-conductor-db-sync-tcrrp\" (UID: \"26eefd11-813a-4df9-831d-0a9e60b1be73\") " pod="openstack/nova-cell0-conductor-db-sync-tcrrp" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.505960 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tcrrp\" (UID: \"26eefd11-813a-4df9-831d-0a9e60b1be73\") " pod="openstack/nova-cell0-conductor-db-sync-tcrrp" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.512519 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-config-data\") pod \"nova-cell0-conductor-db-sync-tcrrp\" (UID: \"26eefd11-813a-4df9-831d-0a9e60b1be73\") " pod="openstack/nova-cell0-conductor-db-sync-tcrrp" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.520216 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-scripts\") pod \"nova-cell0-conductor-db-sync-tcrrp\" (UID: \"26eefd11-813a-4df9-831d-0a9e60b1be73\") " pod="openstack/nova-cell0-conductor-db-sync-tcrrp" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.527225 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp82g\" (UniqueName: \"kubernetes.io/projected/26eefd11-813a-4df9-831d-0a9e60b1be73-kube-api-access-tp82g\") pod \"nova-cell0-conductor-db-sync-tcrrp\" (UID: \"26eefd11-813a-4df9-831d-0a9e60b1be73\") " pod="openstack/nova-cell0-conductor-db-sync-tcrrp" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.529999 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-tcrrp\" (UID: \"26eefd11-813a-4df9-831d-0a9e60b1be73\") " pod="openstack/nova-cell0-conductor-db-sync-tcrrp" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.544978 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tcrrp" Sep 30 17:35:34 crc kubenswrapper[4778]: I0930 17:35:34.601177 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 17:35:34 crc kubenswrapper[4778]: W0930 17:35:34.612865 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1abcff4_99b6_41ab_a5b1_fb4c36f22711.slice/crio-ede1fa8da25a8f2a75d61f643a48dab8e86be3eff1a2a85d42e661bd6e0d0710 WatchSource:0}: Error finding container ede1fa8da25a8f2a75d61f643a48dab8e86be3eff1a2a85d42e661bd6e0d0710: Status 404 returned error can't find the container with id ede1fa8da25a8f2a75d61f643a48dab8e86be3eff1a2a85d42e661bd6e0d0710 Sep 30 17:35:35 crc kubenswrapper[4778]: I0930 17:35:35.005521 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tcrrp"] Sep 30 17:35:35 crc kubenswrapper[4778]: I0930 17:35:35.145951 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tcrrp" event={"ID":"26eefd11-813a-4df9-831d-0a9e60b1be73","Type":"ContainerStarted","Data":"fd093f5fb4c11513532b1d735507cffabccffab8575d4b97a67d436191b842fd"} Sep 30 17:35:35 crc kubenswrapper[4778]: I0930 17:35:35.148193 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df6e6a3a-5259-4462-9af0-439627e7cd46","Type":"ContainerStarted","Data":"7615ea436f26b31766d72e70d34e255a79f7857244152acd20d83fa8d0aa2efa"} Sep 30 17:35:35 crc kubenswrapper[4778]: I0930 17:35:35.148225 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df6e6a3a-5259-4462-9af0-439627e7cd46","Type":"ContainerStarted","Data":"90f0f36c46b0aca7437784604984c11d88611f239acf693cb29b9e1daa3682a5"} Sep 30 17:35:35 crc kubenswrapper[4778]: I0930 17:35:35.149752 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d1abcff4-99b6-41ab-a5b1-fb4c36f22711","Type":"ContainerStarted","Data":"36181c511a45ba0e5df294a75bd8366611b55f32dfa86bd981f062f7c5c1d031"} Sep 30 17:35:35 crc kubenswrapper[4778]: I0930 17:35:35.149799 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d1abcff4-99b6-41ab-a5b1-fb4c36f22711","Type":"ContainerStarted","Data":"ede1fa8da25a8f2a75d61f643a48dab8e86be3eff1a2a85d42e661bd6e0d0710"} Sep 30 17:35:36 crc kubenswrapper[4778]: I0930 17:35:36.159441 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d1abcff4-99b6-41ab-a5b1-fb4c36f22711","Type":"ContainerStarted","Data":"cd93e60d986787cb85dffbb32a6949714d89b603e2b7440b2ad61f496e55d520"} Sep 30 17:35:36 crc kubenswrapper[4778]: I0930 17:35:36.162791 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df6e6a3a-5259-4462-9af0-439627e7cd46","Type":"ContainerStarted","Data":"f61a8b189760121caf13c0ebe29a6ad17921bdb8cd05437120746bcf666320d8"} Sep 30 17:35:36 crc kubenswrapper[4778]: I0930 17:35:36.179768 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.179609601 podStartE2EDuration="3.179609601s" podCreationTimestamp="2025-09-30 17:35:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:35:36.177123023 +0000 UTC m=+1075.167020826" watchObservedRunningTime="2025-09-30 17:35:36.179609601 +0000 UTC m=+1075.169507404" Sep 30 17:35:36 crc kubenswrapper[4778]: I0930 17:35:36.204789 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.204766554 podStartE2EDuration="3.204766554s" podCreationTimestamp="2025-09-30 17:35:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:35:36.201051698 +0000 UTC m=+1075.190949521" watchObservedRunningTime="2025-09-30 17:35:36.204766554 +0000 UTC m=+1075.194664357" Sep 30 17:35:42 crc kubenswrapper[4778]: I0930 17:35:42.223566 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tcrrp" event={"ID":"26eefd11-813a-4df9-831d-0a9e60b1be73","Type":"ContainerStarted","Data":"9d0e17aa62562f2605b6e9bc77b22cf7df4657dffc7b25b0b126e77378898822"} Sep 30 17:35:42 crc kubenswrapper[4778]: I0930 17:35:42.252379 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-tcrrp" podStartSLOduration=1.858398995 podStartE2EDuration="8.25235975s" podCreationTimestamp="2025-09-30 17:35:34 +0000 UTC" firstStartedPulling="2025-09-30 17:35:35.011084514 +0000 UTC m=+1074.000982317" lastFinishedPulling="2025-09-30 17:35:41.405045279 +0000 UTC m=+1080.394943072" observedRunningTime="2025-09-30 17:35:42.238273775 +0000 UTC m=+1081.228171578" watchObservedRunningTime="2025-09-30 17:35:42.25235975 +0000 UTC m=+1081.242257553" Sep 30 17:35:43 crc kubenswrapper[4778]: I0930 17:35:43.825874 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 17:35:43 crc kubenswrapper[4778]: I0930 17:35:43.826302 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 17:35:43 crc kubenswrapper[4778]: I0930 17:35:43.837541 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 17:35:43 crc kubenswrapper[4778]: I0930 17:35:43.837612 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 17:35:43 crc kubenswrapper[4778]: I0930 17:35:43.862267 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 17:35:43 crc kubenswrapper[4778]: I0930 17:35:43.884360 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 17:35:43 crc kubenswrapper[4778]: I0930 17:35:43.886695 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 17:35:43 crc kubenswrapper[4778]: I0930 17:35:43.904199 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 17:35:44 crc kubenswrapper[4778]: I0930 17:35:44.244985 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 17:35:44 crc kubenswrapper[4778]: I0930 17:35:44.245030 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 17:35:44 crc 
kubenswrapper[4778]: I0930 17:35:44.245045 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 17:35:44 crc kubenswrapper[4778]: I0930 17:35:44.245054 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 17:35:46 crc kubenswrapper[4778]: I0930 17:35:46.107087 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 17:35:46 crc kubenswrapper[4778]: I0930 17:35:46.180609 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 17:35:46 crc kubenswrapper[4778]: I0930 17:35:46.198484 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 17:35:46 crc kubenswrapper[4778]: I0930 17:35:46.252224 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 17:35:52 crc kubenswrapper[4778]: I0930 17:35:52.332989 4778 generic.go:334] "Generic (PLEG): container finished" podID="26eefd11-813a-4df9-831d-0a9e60b1be73" containerID="9d0e17aa62562f2605b6e9bc77b22cf7df4657dffc7b25b0b126e77378898822" exitCode=0 Sep 30 17:35:52 crc kubenswrapper[4778]: I0930 17:35:52.333063 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tcrrp" event={"ID":"26eefd11-813a-4df9-831d-0a9e60b1be73","Type":"ContainerDied","Data":"9d0e17aa62562f2605b6e9bc77b22cf7df4657dffc7b25b0b126e77378898822"} Sep 30 17:35:53 crc kubenswrapper[4778]: I0930 17:35:53.693723 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tcrrp" Sep 30 17:35:53 crc kubenswrapper[4778]: I0930 17:35:53.873967 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp82g\" (UniqueName: \"kubernetes.io/projected/26eefd11-813a-4df9-831d-0a9e60b1be73-kube-api-access-tp82g\") pod \"26eefd11-813a-4df9-831d-0a9e60b1be73\" (UID: \"26eefd11-813a-4df9-831d-0a9e60b1be73\") " Sep 30 17:35:53 crc kubenswrapper[4778]: I0930 17:35:53.874429 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-scripts\") pod \"26eefd11-813a-4df9-831d-0a9e60b1be73\" (UID: \"26eefd11-813a-4df9-831d-0a9e60b1be73\") " Sep 30 17:35:53 crc kubenswrapper[4778]: I0930 17:35:53.874521 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-config-data\") pod \"26eefd11-813a-4df9-831d-0a9e60b1be73\" (UID: \"26eefd11-813a-4df9-831d-0a9e60b1be73\") " Sep 30 17:35:53 crc kubenswrapper[4778]: I0930 17:35:53.874566 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-combined-ca-bundle\") pod \"26eefd11-813a-4df9-831d-0a9e60b1be73\" (UID: \"26eefd11-813a-4df9-831d-0a9e60b1be73\") " Sep 30 17:35:53 crc kubenswrapper[4778]: I0930 17:35:53.879748 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26eefd11-813a-4df9-831d-0a9e60b1be73-kube-api-access-tp82g" (OuterVolumeSpecName: "kube-api-access-tp82g") pod 
"26eefd11-813a-4df9-831d-0a9e60b1be73" (UID: "26eefd11-813a-4df9-831d-0a9e60b1be73"). InnerVolumeSpecName "kube-api-access-tp82g". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:35:53 crc kubenswrapper[4778]: I0930 17:35:53.879993 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-scripts" (OuterVolumeSpecName: "scripts") pod "26eefd11-813a-4df9-831d-0a9e60b1be73" (UID: "26eefd11-813a-4df9-831d-0a9e60b1be73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:53 crc kubenswrapper[4778]: I0930 17:35:53.907212 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "26eefd11-813a-4df9-831d-0a9e60b1be73" (UID: "26eefd11-813a-4df9-831d-0a9e60b1be73"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:53 crc kubenswrapper[4778]: I0930 17:35:53.926484 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-config-data" (OuterVolumeSpecName: "config-data") pod "26eefd11-813a-4df9-831d-0a9e60b1be73" (UID: "26eefd11-813a-4df9-831d-0a9e60b1be73"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:35:53 crc kubenswrapper[4778]: I0930 17:35:53.977417 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp82g\" (UniqueName: \"kubernetes.io/projected/26eefd11-813a-4df9-831d-0a9e60b1be73-kube-api-access-tp82g\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:53 crc kubenswrapper[4778]: I0930 17:35:53.977481 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:53 crc kubenswrapper[4778]: I0930 17:35:53.977501 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:53 crc kubenswrapper[4778]: I0930 17:35:53.977520 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/26eefd11-813a-4df9-831d-0a9e60b1be73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.353547 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-tcrrp" event={"ID":"26eefd11-813a-4df9-831d-0a9e60b1be73","Type":"ContainerDied","Data":"fd093f5fb4c11513532b1d735507cffabccffab8575d4b97a67d436191b842fd"} Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.353605 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd093f5fb4c11513532b1d735507cffabccffab8575d4b97a67d436191b842fd" Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.353736 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-tcrrp" Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.698247 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 17:35:54 crc kubenswrapper[4778]: E0930 17:35:54.698838 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26eefd11-813a-4df9-831d-0a9e60b1be73" containerName="nova-cell0-conductor-db-sync" Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.698863 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="26eefd11-813a-4df9-831d-0a9e60b1be73" containerName="nova-cell0-conductor-db-sync" Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.699214 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="26eefd11-813a-4df9-831d-0a9e60b1be73" containerName="nova-cell0-conductor-db-sync" Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.700171 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.702982 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-pt2zr" Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.703218 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.710493 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.791683 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb567d62-92bc-46d5-a998-e96a2469b117-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bb567d62-92bc-46d5-a998-e96a2469b117\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.791733 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb567d62-92bc-46d5-a998-e96a2469b117-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bb567d62-92bc-46d5-a998-e96a2469b117\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.791970 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gd56\" (UniqueName: \"kubernetes.io/projected/bb567d62-92bc-46d5-a998-e96a2469b117-kube-api-access-9gd56\") pod \"nova-cell0-conductor-0\" (UID: \"bb567d62-92bc-46d5-a998-e96a2469b117\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.893532 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gd56\" (UniqueName: \"kubernetes.io/projected/bb567d62-92bc-46d5-a998-e96a2469b117-kube-api-access-9gd56\") pod \"nova-cell0-conductor-0\" (UID: \"bb567d62-92bc-46d5-a998-e96a2469b117\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.893598 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb567d62-92bc-46d5-a998-e96a2469b117-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bb567d62-92bc-46d5-a998-e96a2469b117\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:35:54 crc 
kubenswrapper[4778]: I0930 17:35:54.893690 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb567d62-92bc-46d5-a998-e96a2469b117-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bb567d62-92bc-46d5-a998-e96a2469b117\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.900692 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bb567d62-92bc-46d5-a998-e96a2469b117-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"bb567d62-92bc-46d5-a998-e96a2469b117\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.900821 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb567d62-92bc-46d5-a998-e96a2469b117-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"bb567d62-92bc-46d5-a998-e96a2469b117\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:35:54 crc kubenswrapper[4778]: I0930 17:35:54.921469 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gd56\" (UniqueName: \"kubernetes.io/projected/bb567d62-92bc-46d5-a998-e96a2469b117-kube-api-access-9gd56\") pod \"nova-cell0-conductor-0\" (UID: \"bb567d62-92bc-46d5-a998-e96a2469b117\") " pod="openstack/nova-cell0-conductor-0" Sep 30 17:35:55 crc kubenswrapper[4778]: I0930 17:35:55.033942 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 17:35:55 crc kubenswrapper[4778]: I0930 17:35:55.307939 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 17:35:55 crc kubenswrapper[4778]: W0930 17:35:55.308506 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbb567d62_92bc_46d5_a998_e96a2469b117.slice/crio-54b4046e7da9cf4d4ecbc9b7221f9c030d253e04b0013d5bb06995f9968f22cb WatchSource:0}: Error finding container 54b4046e7da9cf4d4ecbc9b7221f9c030d253e04b0013d5bb06995f9968f22cb: Status 404 returned error can't find the container with id 54b4046e7da9cf4d4ecbc9b7221f9c030d253e04b0013d5bb06995f9968f22cb Sep 30 17:35:55 crc kubenswrapper[4778]: I0930 17:35:55.366316 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bb567d62-92bc-46d5-a998-e96a2469b117","Type":"ContainerStarted","Data":"54b4046e7da9cf4d4ecbc9b7221f9c030d253e04b0013d5bb06995f9968f22cb"} Sep 30 17:35:56 crc kubenswrapper[4778]: I0930 17:35:56.381405 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"bb567d62-92bc-46d5-a998-e96a2469b117","Type":"ContainerStarted","Data":"74574d4cff646dcfee60bb53e6ff4a0b47dce95b4505f2debedbb77be3ebf490"} Sep 30 17:35:56 crc kubenswrapper[4778]: I0930 17:35:56.381843 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Sep 30 17:35:56 crc kubenswrapper[4778]: I0930 17:35:56.432034 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.432006603 podStartE2EDuration="2.432006603s" podCreationTimestamp="2025-09-30 17:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-09-30 17:35:56.410864125 +0000 UTC m=+1095.400761968" watchObservedRunningTime="2025-09-30 17:35:56.432006603 +0000 UTC m=+1095.421904446" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.070703 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.655173 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-qf925"] Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.657838 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qf925" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.663077 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.663466 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.669072 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qf925"] Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.714225 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b7xj\" (UniqueName: \"kubernetes.io/projected/ca4b7091-db1d-4b92-9009-ccafd842d405-kube-api-access-2b7xj\") pod \"nova-cell0-cell-mapping-qf925\" (UID: \"ca4b7091-db1d-4b92-9009-ccafd842d405\") " pod="openstack/nova-cell0-cell-mapping-qf925" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.714391 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-scripts\") pod \"nova-cell0-cell-mapping-qf925\" (UID: \"ca4b7091-db1d-4b92-9009-ccafd842d405\") " pod="openstack/nova-cell0-cell-mapping-qf925" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.714470 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-config-data\") pod \"nova-cell0-cell-mapping-qf925\" (UID: \"ca4b7091-db1d-4b92-9009-ccafd842d405\") " pod="openstack/nova-cell0-cell-mapping-qf925" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.714512 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qf925\" (UID: \"ca4b7091-db1d-4b92-9009-ccafd842d405\") " pod="openstack/nova-cell0-cell-mapping-qf925" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.815951 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-config-data\") pod \"nova-cell0-cell-mapping-qf925\" (UID: \"ca4b7091-db1d-4b92-9009-ccafd842d405\") " pod="openstack/nova-cell0-cell-mapping-qf925" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.816013 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qf925\" (UID: 
\"ca4b7091-db1d-4b92-9009-ccafd842d405\") " pod="openstack/nova-cell0-cell-mapping-qf925" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.816084 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b7xj\" (UniqueName: \"kubernetes.io/projected/ca4b7091-db1d-4b92-9009-ccafd842d405-kube-api-access-2b7xj\") pod \"nova-cell0-cell-mapping-qf925\" (UID: \"ca4b7091-db1d-4b92-9009-ccafd842d405\") " pod="openstack/nova-cell0-cell-mapping-qf925" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.816185 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-scripts\") pod \"nova-cell0-cell-mapping-qf925\" (UID: \"ca4b7091-db1d-4b92-9009-ccafd842d405\") " pod="openstack/nova-cell0-cell-mapping-qf925" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.826532 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-qf925\" (UID: \"ca4b7091-db1d-4b92-9009-ccafd842d405\") " pod="openstack/nova-cell0-cell-mapping-qf925" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.827038 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-scripts\") pod \"nova-cell0-cell-mapping-qf925\" (UID: \"ca4b7091-db1d-4b92-9009-ccafd842d405\") " pod="openstack/nova-cell0-cell-mapping-qf925" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.830011 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-config-data\") pod \"nova-cell0-cell-mapping-qf925\" (UID: \"ca4b7091-db1d-4b92-9009-ccafd842d405\") " pod="openstack/nova-cell0-cell-mapping-qf925" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.851033 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b7xj\" (UniqueName: \"kubernetes.io/projected/ca4b7091-db1d-4b92-9009-ccafd842d405-kube-api-access-2b7xj\") pod \"nova-cell0-cell-mapping-qf925\" (UID: \"ca4b7091-db1d-4b92-9009-ccafd842d405\") " pod="openstack/nova-cell0-cell-mapping-qf925" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.854038 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.860272 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.863694 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.877902 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.918410 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5995b85-629b-4772-85b4-22753e1935a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f5995b85-629b-4772-85b4-22753e1935a0\") " pod="openstack/nova-api-0" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.918464 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2tqw\" (UniqueName: \"kubernetes.io/projected/f5995b85-629b-4772-85b4-22753e1935a0-kube-api-access-p2tqw\") pod \"nova-api-0\" (UID: \"f5995b85-629b-4772-85b4-22753e1935a0\") " pod="openstack/nova-api-0" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.918512 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5995b85-629b-4772-85b4-22753e1935a0-logs\") pod \"nova-api-0\" (UID: \"f5995b85-629b-4772-85b4-22753e1935a0\") " pod="openstack/nova-api-0" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.918601 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5995b85-629b-4772-85b4-22753e1935a0-config-data\") pod \"nova-api-0\" (UID: \"f5995b85-629b-4772-85b4-22753e1935a0\") " pod="openstack/nova-api-0" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.937368 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.938709 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.941256 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 17:36:00 crc kubenswrapper[4778]: I0930 17:36:00.982062 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qf925" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.023703 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5995b85-629b-4772-85b4-22753e1935a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f5995b85-629b-4772-85b4-22753e1935a0\") " pod="openstack/nova-api-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.023927 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2tqw\" (UniqueName: \"kubernetes.io/projected/f5995b85-629b-4772-85b4-22753e1935a0-kube-api-access-p2tqw\") pod \"nova-api-0\" (UID: \"f5995b85-629b-4772-85b4-22753e1935a0\") " pod="openstack/nova-api-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.024025 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5995b85-629b-4772-85b4-22753e1935a0-logs\") pod \"nova-api-0\" (UID: \"f5995b85-629b-4772-85b4-22753e1935a0\") " pod="openstack/nova-api-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.024863 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5995b85-629b-4772-85b4-22753e1935a0-logs\") pod \"nova-api-0\" (UID: \"f5995b85-629b-4772-85b4-22753e1935a0\") " pod="openstack/nova-api-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.024882 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5995b85-629b-4772-85b4-22753e1935a0-config-data\") pod \"nova-api-0\" (UID: \"f5995b85-629b-4772-85b4-22753e1935a0\") " pod="openstack/nova-api-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.029996 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.036161 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5995b85-629b-4772-85b4-22753e1935a0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f5995b85-629b-4772-85b4-22753e1935a0\") " pod="openstack/nova-api-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.045323 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5995b85-629b-4772-85b4-22753e1935a0-config-data\") pod \"nova-api-0\" (UID: \"f5995b85-629b-4772-85b4-22753e1935a0\") " pod="openstack/nova-api-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.070777 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-745f868dcf-njjc6"] Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.088674 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.108361 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2tqw\" (UniqueName: \"kubernetes.io/projected/f5995b85-629b-4772-85b4-22753e1935a0-kube-api-access-p2tqw\") pod \"nova-api-0\" (UID: \"f5995b85-629b-4772-85b4-22753e1935a0\") " pod="openstack/nova-api-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.114688 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.115976 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.118517 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.130445 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.134569 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-config-data\") pod \"nova-metadata-0\" (UID: \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\") " pod="openstack/nova-metadata-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.134641 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-config\") pod \"dnsmasq-dns-745f868dcf-njjc6\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") " pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.134682 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-ovsdbserver-nb\") pod \"dnsmasq-dns-745f868dcf-njjc6\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") " pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.134712 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c\") " pod="openstack/nova-scheduler-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.134729 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-ovsdbserver-sb\") pod \"dnsmasq-dns-745f868dcf-njjc6\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") " pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.134754 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzb95\" (UniqueName: \"kubernetes.io/projected/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-kube-api-access-pzb95\") pod \"nova-scheduler-0\" (UID: \"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c\") " pod="openstack/nova-scheduler-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.134776 4778 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-config-data\") pod \"nova-scheduler-0\" (UID: \"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c\") " pod="openstack/nova-scheduler-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.134794 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-logs\") pod \"nova-metadata-0\" (UID: \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\") " pod="openstack/nova-metadata-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.134810 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\") " pod="openstack/nova-metadata-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.134827 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb2vd\" (UniqueName: \"kubernetes.io/projected/2f31c368-4084-49e9-8d47-c7eb8541b0d2-kube-api-access-nb2vd\") pod \"dnsmasq-dns-745f868dcf-njjc6\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") " pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.134856 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7kwr\" (UniqueName: \"kubernetes.io/projected/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-kube-api-access-s7kwr\") pod \"nova-metadata-0\" (UID: \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\") " pod="openstack/nova-metadata-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.134901 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-dns-svc\") pod \"dnsmasq-dns-745f868dcf-njjc6\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") " pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.146681 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745f868dcf-njjc6"] Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.185172 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.186864 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.190359 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.206351 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.236665 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-dns-svc\") pod \"dnsmasq-dns-745f868dcf-njjc6\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") " pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.236737 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-config-data\") pod \"nova-metadata-0\" (UID: \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\") " pod="openstack/nova-metadata-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.236770 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxhjk\" (UniqueName: \"kubernetes.io/projected/9171d6da-2c81-45f9-a295-92b42f5a6c09-kube-api-access-lxhjk\") pod \"nova-cell1-novncproxy-0\" (UID: \"9171d6da-2c81-45f9-a295-92b42f5a6c09\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.236778 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.236801 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-config\") pod \"dnsmasq-dns-745f868dcf-njjc6\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") " pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.237088 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-ovsdbserver-nb\") pod \"dnsmasq-dns-745f868dcf-njjc6\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") " pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.237162 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c\") " pod="openstack/nova-scheduler-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.237201 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-ovsdbserver-sb\") pod \"dnsmasq-dns-745f868dcf-njjc6\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") " pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.237237 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9171d6da-2c81-45f9-a295-92b42f5a6c09-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"9171d6da-2c81-45f9-a295-92b42f5a6c09\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.237262 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9171d6da-2c81-45f9-a295-92b42f5a6c09-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9171d6da-2c81-45f9-a295-92b42f5a6c09\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.237300 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzb95\" (UniqueName: \"kubernetes.io/projected/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-kube-api-access-pzb95\") pod \"nova-scheduler-0\" (UID: \"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c\") " pod="openstack/nova-scheduler-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.237880 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-dns-svc\") pod \"dnsmasq-dns-745f868dcf-njjc6\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") " pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.237946 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-config\") pod \"dnsmasq-dns-745f868dcf-njjc6\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") " pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.238318 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-ovsdbserver-nb\") pod \"dnsmasq-dns-745f868dcf-njjc6\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") " pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.238816 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-config-data\") pod \"nova-scheduler-0\" (UID: \"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c\") " pod="openstack/nova-scheduler-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.238865 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-logs\") pod \"nova-metadata-0\" (UID: \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\") " pod="openstack/nova-metadata-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.238896 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\") " pod="openstack/nova-metadata-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.238914 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb2vd\" (UniqueName: \"kubernetes.io/projected/2f31c368-4084-49e9-8d47-c7eb8541b0d2-kube-api-access-nb2vd\") pod \"dnsmasq-dns-745f868dcf-njjc6\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") " pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.238973 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s7kwr\" (UniqueName: \"kubernetes.io/projected/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-kube-api-access-s7kwr\") pod \"nova-metadata-0\" (UID: \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\") " pod="openstack/nova-metadata-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.240021 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-logs\") pod \"nova-metadata-0\" (UID: \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\") " pod="openstack/nova-metadata-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.242002 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c\") " pod="openstack/nova-scheduler-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.242574 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-ovsdbserver-sb\") pod \"dnsmasq-dns-745f868dcf-njjc6\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") " pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.242840 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-config-data\") pod \"nova-scheduler-0\" (UID: \"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c\") " pod="openstack/nova-scheduler-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.258813 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\") " pod="openstack/nova-metadata-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.261820 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-config-data\") pod \"nova-metadata-0\" (UID: \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\") " pod="openstack/nova-metadata-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.264076 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb2vd\" (UniqueName: \"kubernetes.io/projected/2f31c368-4084-49e9-8d47-c7eb8541b0d2-kube-api-access-nb2vd\") pod \"dnsmasq-dns-745f868dcf-njjc6\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") " pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.268924 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7kwr\" (UniqueName: \"kubernetes.io/projected/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-kube-api-access-s7kwr\") pod \"nova-metadata-0\" (UID: \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\") " pod="openstack/nova-metadata-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.270813 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzb95\" (UniqueName: \"kubernetes.io/projected/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-kube-api-access-pzb95\") pod \"nova-scheduler-0\" (UID: \"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c\") " pod="openstack/nova-scheduler-0" Sep 
30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.311749 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.342158 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxhjk\" (UniqueName: \"kubernetes.io/projected/9171d6da-2c81-45f9-a295-92b42f5a6c09-kube-api-access-lxhjk\") pod \"nova-cell1-novncproxy-0\" (UID: \"9171d6da-2c81-45f9-a295-92b42f5a6c09\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.342280 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9171d6da-2c81-45f9-a295-92b42f5a6c09-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9171d6da-2c81-45f9-a295-92b42f5a6c09\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.342302 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9171d6da-2c81-45f9-a295-92b42f5a6c09-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9171d6da-2c81-45f9-a295-92b42f5a6c09\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.348169 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9171d6da-2c81-45f9-a295-92b42f5a6c09-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"9171d6da-2c81-45f9-a295-92b42f5a6c09\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.354256 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9171d6da-2c81-45f9-a295-92b42f5a6c09-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"9171d6da-2c81-45f9-a295-92b42f5a6c09\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.361051 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxhjk\" (UniqueName: \"kubernetes.io/projected/9171d6da-2c81-45f9-a295-92b42f5a6c09-kube-api-access-lxhjk\") pod \"nova-cell1-novncproxy-0\" (UID: \"9171d6da-2c81-45f9-a295-92b42f5a6c09\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.503240 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.523988 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.530266 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.566243 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-qf925"] Sep 30 17:36:01 crc kubenswrapper[4778]: W0930 17:36:01.580348 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca4b7091_db1d_4b92_9009_ccafd842d405.slice/crio-abc7a31cdb4ec40433ac0e6b7eb454a1041026f6c45e0976da4036637f696f4b WatchSource:0}: Error finding container abc7a31cdb4ec40433ac0e6b7eb454a1041026f6c45e0976da4036637f696f4b: Status 404 returned error can't find the container with id abc7a31cdb4ec40433ac0e6b7eb454a1041026f6c45e0976da4036637f696f4b Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.667486 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7pc2n"] Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.668935 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7pc2n" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.672542 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.672713 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.684953 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7pc2n"] Sep 30 17:36:01 crc kubenswrapper[4778]: W0930 17:36:01.732535 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5995b85_629b_4772_85b4_22753e1935a0.slice/crio-1aba9f6dfe8e5c515c6cb30a4eb9ecf0347dd0f546303b8e55be8637d4be00e4 WatchSource:0}: Error finding container 1aba9f6dfe8e5c515c6cb30a4eb9ecf0347dd0f546303b8e55be8637d4be00e4: Status 404 returned error can't find the container with id 1aba9f6dfe8e5c515c6cb30a4eb9ecf0347dd0f546303b8e55be8637d4be00e4 Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.739073 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.754529 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-config-data\") pod \"nova-cell1-conductor-db-sync-7pc2n\" (UID: \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\") " pod="openstack/nova-cell1-conductor-db-sync-7pc2n" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.754575 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-scripts\") pod \"nova-cell1-conductor-db-sync-7pc2n\" (UID: \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\") " pod="openstack/nova-cell1-conductor-db-sync-7pc2n" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.754601 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7pc2n\" (UID: \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\") " 
pod="openstack/nova-cell1-conductor-db-sync-7pc2n" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.754669 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbmsw\" (UniqueName: \"kubernetes.io/projected/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-kube-api-access-bbmsw\") pod \"nova-cell1-conductor-db-sync-7pc2n\" (UID: \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\") " pod="openstack/nova-cell1-conductor-db-sync-7pc2n" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.841203 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:36:01 crc kubenswrapper[4778]: W0930 17:36:01.854930 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54b3d113_98b5_4201_bef7_a7c3d7fe1c5b.slice/crio-b91393f58a3da5e82daa17a5dc4f0e06fe41ab81e8d0c8a701b6f1b2a818ed86 WatchSource:0}: Error finding container b91393f58a3da5e82daa17a5dc4f0e06fe41ab81e8d0c8a701b6f1b2a818ed86: Status 404 returned error can't find the container with id b91393f58a3da5e82daa17a5dc4f0e06fe41ab81e8d0c8a701b6f1b2a818ed86 Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.857591 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-config-data\") pod \"nova-cell1-conductor-db-sync-7pc2n\" (UID: \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\") " pod="openstack/nova-cell1-conductor-db-sync-7pc2n" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.857662 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-scripts\") pod \"nova-cell1-conductor-db-sync-7pc2n\" (UID: \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\") " pod="openstack/nova-cell1-conductor-db-sync-7pc2n" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.857712 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7pc2n\" (UID: \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\") " pod="openstack/nova-cell1-conductor-db-sync-7pc2n" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.859379 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbmsw\" (UniqueName: \"kubernetes.io/projected/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-kube-api-access-bbmsw\") pod \"nova-cell1-conductor-db-sync-7pc2n\" (UID: \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\") " pod="openstack/nova-cell1-conductor-db-sync-7pc2n" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.862656 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-scripts\") pod \"nova-cell1-conductor-db-sync-7pc2n\" (UID: \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\") " pod="openstack/nova-cell1-conductor-db-sync-7pc2n" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.874523 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7pc2n\" (UID: \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\") " pod="openstack/nova-cell1-conductor-db-sync-7pc2n" Sep 30 
17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.877775 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-config-data\") pod \"nova-cell1-conductor-db-sync-7pc2n\" (UID: \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\") " pod="openstack/nova-cell1-conductor-db-sync-7pc2n" Sep 30 17:36:01 crc kubenswrapper[4778]: I0930 17:36:01.884276 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbmsw\" (UniqueName: \"kubernetes.io/projected/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-kube-api-access-bbmsw\") pod \"nova-cell1-conductor-db-sync-7pc2n\" (UID: \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\") " pod="openstack/nova-cell1-conductor-db-sync-7pc2n" Sep 30 17:36:02 crc kubenswrapper[4778]: I0930 17:36:02.002823 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7pc2n" Sep 30 17:36:02 crc kubenswrapper[4778]: I0930 17:36:02.047920 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:36:02 crc kubenswrapper[4778]: I0930 17:36:02.081216 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-745f868dcf-njjc6"] Sep 30 17:36:02 crc kubenswrapper[4778]: W0930 17:36:02.097733 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f31c368_4084_49e9_8d47_c7eb8541b0d2.slice/crio-fdca139a5d23e24a602d9a24c1b9642a0b24ff7d0d88aa2ad9eb13aa8893df12 WatchSource:0}: Error finding container fdca139a5d23e24a602d9a24c1b9642a0b24ff7d0d88aa2ad9eb13aa8893df12: Status 404 returned error can't find the container with id fdca139a5d23e24a602d9a24c1b9642a0b24ff7d0d88aa2ad9eb13aa8893df12 Sep 30 17:36:02 crc kubenswrapper[4778]: I0930 17:36:02.165843 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:36:02 crc kubenswrapper[4778]: I0930 17:36:02.443964 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9171d6da-2c81-45f9-a295-92b42f5a6c09","Type":"ContainerStarted","Data":"7a08f3fc8ef43cbdf84a0fb09ae769009f27cecf66344b00d9e919b558572f12"} Sep 30 17:36:02 crc kubenswrapper[4778]: I0930 17:36:02.448249 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c","Type":"ContainerStarted","Data":"0e27c6c8bc653f652cf2800a5bca33ed2b16fe71270303b9993ac44ccd7720ba"} Sep 30 17:36:02 crc kubenswrapper[4778]: I0930 17:36:02.450404 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qf925" event={"ID":"ca4b7091-db1d-4b92-9009-ccafd842d405","Type":"ContainerStarted","Data":"50699ae44e61c9c97eb836537114a703db357385e7a871995a01d573e5661fa9"} Sep 30 17:36:02 crc kubenswrapper[4778]: I0930 17:36:02.450709 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qf925" event={"ID":"ca4b7091-db1d-4b92-9009-ccafd842d405","Type":"ContainerStarted","Data":"abc7a31cdb4ec40433ac0e6b7eb454a1041026f6c45e0976da4036637f696f4b"} Sep 30 17:36:02 crc kubenswrapper[4778]: I0930 17:36:02.453701 4778 generic.go:334] "Generic (PLEG): container finished" podID="2f31c368-4084-49e9-8d47-c7eb8541b0d2" containerID="0cca25f47fab96516cced40e050dd5b41df8ec24b24e971fb35e4eea27f0fbab" exitCode=0 Sep 30 17:36:02 crc kubenswrapper[4778]: I0930 17:36:02.453790 
4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745f868dcf-njjc6" event={"ID":"2f31c368-4084-49e9-8d47-c7eb8541b0d2","Type":"ContainerDied","Data":"0cca25f47fab96516cced40e050dd5b41df8ec24b24e971fb35e4eea27f0fbab"} Sep 30 17:36:02 crc kubenswrapper[4778]: I0930 17:36:02.453818 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745f868dcf-njjc6" event={"ID":"2f31c368-4084-49e9-8d47-c7eb8541b0d2","Type":"ContainerStarted","Data":"fdca139a5d23e24a602d9a24c1b9642a0b24ff7d0d88aa2ad9eb13aa8893df12"} Sep 30 17:36:02 crc kubenswrapper[4778]: I0930 17:36:02.455319 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b","Type":"ContainerStarted","Data":"b91393f58a3da5e82daa17a5dc4f0e06fe41ab81e8d0c8a701b6f1b2a818ed86"} Sep 30 17:36:02 crc kubenswrapper[4778]: I0930 17:36:02.456701 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5995b85-629b-4772-85b4-22753e1935a0","Type":"ContainerStarted","Data":"1aba9f6dfe8e5c515c6cb30a4eb9ecf0347dd0f546303b8e55be8637d4be00e4"} Sep 30 17:36:02 crc kubenswrapper[4778]: I0930 17:36:02.470958 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-qf925" podStartSLOduration=2.470939023 podStartE2EDuration="2.470939023s" podCreationTimestamp="2025-09-30 17:36:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:36:02.466741631 +0000 UTC m=+1101.456639474" watchObservedRunningTime="2025-09-30 17:36:02.470939023 +0000 UTC m=+1101.460836826" Sep 30 17:36:02 crc kubenswrapper[4778]: W0930 17:36:02.549356 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ef25b93_966b_4fc0_85e9_9d6c1cbfb9b6.slice/crio-98ffec1f268770cf7a881be2d8bf72a14195e30e91ec42b7612b3c2b57270099 WatchSource:0}: Error finding container 98ffec1f268770cf7a881be2d8bf72a14195e30e91ec42b7612b3c2b57270099: Status 404 returned error can't find the container with id 98ffec1f268770cf7a881be2d8bf72a14195e30e91ec42b7612b3c2b57270099 Sep 30 17:36:02 crc kubenswrapper[4778]: I0930 17:36:02.549832 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7pc2n"] Sep 30 17:36:03 crc kubenswrapper[4778]: I0930 17:36:03.465696 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7pc2n" event={"ID":"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6","Type":"ContainerStarted","Data":"419f04cd88fee8958a7bee7106a0501124468df502b036ffa77804cd1497f03a"} Sep 30 17:36:03 crc kubenswrapper[4778]: I0930 17:36:03.466051 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7pc2n" event={"ID":"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6","Type":"ContainerStarted","Data":"98ffec1f268770cf7a881be2d8bf72a14195e30e91ec42b7612b3c2b57270099"} Sep 30 17:36:03 crc kubenswrapper[4778]: I0930 17:36:03.469591 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745f868dcf-njjc6" event={"ID":"2f31c368-4084-49e9-8d47-c7eb8541b0d2","Type":"ContainerStarted","Data":"9399b08040b6e3234013278aa69986bf9af8df5134d42c7c0dfbfb4397c02dd0"} Sep 30 17:36:03 crc kubenswrapper[4778]: I0930 17:36:03.469834 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:03 crc kubenswrapper[4778]: I0930 17:36:03.488530 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-7pc2n" podStartSLOduration=2.488508846 podStartE2EDuration="2.488508846s" podCreationTimestamp="2025-09-30 17:36:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:36:03.48386253 +0000 UTC m=+1102.473760323" watchObservedRunningTime="2025-09-30 17:36:03.488508846 +0000 UTC m=+1102.478406649" Sep 30 17:36:03 crc kubenswrapper[4778]: I0930 17:36:03.509872 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-745f868dcf-njjc6" podStartSLOduration=3.5098518800000003 podStartE2EDuration="3.50985188s" podCreationTimestamp="2025-09-30 17:36:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:36:03.501482876 +0000 UTC m=+1102.491380759" watchObservedRunningTime="2025-09-30 17:36:03.50985188 +0000 UTC m=+1102.499749703" Sep 30 17:36:04 crc kubenswrapper[4778]: I0930 17:36:04.456392 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:36:04 crc kubenswrapper[4778]: I0930 17:36:04.467277 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:36:05 crc kubenswrapper[4778]: I0930 17:36:05.484430 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9171d6da-2c81-45f9-a295-92b42f5a6c09","Type":"ContainerStarted","Data":"499ca2af5f7a85a01e8327be79b09172d708b959d5612515a4694028383701a3"} Sep 30 17:36:05 crc kubenswrapper[4778]: I0930 17:36:05.484547 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="9171d6da-2c81-45f9-a295-92b42f5a6c09" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://499ca2af5f7a85a01e8327be79b09172d708b959d5612515a4694028383701a3" gracePeriod=30 Sep 30 17:36:05 crc kubenswrapper[4778]: I0930 17:36:05.491343 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c","Type":"ContainerStarted","Data":"553928e888fc4423bf37e8a7d0b94370bd06d41e5e7486232c172115fa61f165"} Sep 30 17:36:05 crc kubenswrapper[4778]: I0930 17:36:05.495796 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b","Type":"ContainerStarted","Data":"30ec4090d9d28a7ce8850595f63fb25489ef44a273ee5eaad477b8790f6b8d54"} Sep 30 17:36:05 crc kubenswrapper[4778]: I0930 17:36:05.495848 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b","Type":"ContainerStarted","Data":"877f10dffd5bf369aef3fa3bc28f27b41aaf09ced43615e8ba750ad4489869a9"} Sep 30 17:36:05 crc kubenswrapper[4778]: I0930 17:36:05.495935 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="54b3d113-98b5-4201-bef7-a7c3d7fe1c5b" containerName="nova-metadata-log" containerID="cri-o://877f10dffd5bf369aef3fa3bc28f27b41aaf09ced43615e8ba750ad4489869a9" gracePeriod=30 Sep 30 17:36:05 crc kubenswrapper[4778]: I0930 17:36:05.495964 4778 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="54b3d113-98b5-4201-bef7-a7c3d7fe1c5b" containerName="nova-metadata-metadata" containerID="cri-o://30ec4090d9d28a7ce8850595f63fb25489ef44a273ee5eaad477b8790f6b8d54" gracePeriod=30 Sep 30 17:36:05 crc kubenswrapper[4778]: I0930 17:36:05.497852 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5995b85-629b-4772-85b4-22753e1935a0","Type":"ContainerStarted","Data":"cb74dd3b0b5f71e53ede8545cdecdb89065975b52a7b9d34259c83695fb82110"} Sep 30 17:36:05 crc kubenswrapper[4778]: I0930 17:36:05.497897 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5995b85-629b-4772-85b4-22753e1935a0","Type":"ContainerStarted","Data":"ceabbf2a59c1b1361e0f94cf857f20f528f70723f191b0392d8e54bcf17807aa"} Sep 30 17:36:05 crc kubenswrapper[4778]: I0930 17:36:05.508771 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.72052223 podStartE2EDuration="4.508751973s" podCreationTimestamp="2025-09-30 17:36:01 +0000 UTC" firstStartedPulling="2025-09-30 17:36:02.066852221 +0000 UTC m=+1101.056750024" lastFinishedPulling="2025-09-30 17:36:04.855081964 +0000 UTC m=+1103.844979767" observedRunningTime="2025-09-30 17:36:05.501064111 +0000 UTC m=+1104.490961914" watchObservedRunningTime="2025-09-30 17:36:05.508751973 +0000 UTC m=+1104.498649776" Sep 30 17:36:05 crc kubenswrapper[4778]: I0930 17:36:05.522424 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.885980322 podStartE2EDuration="4.522408284s" podCreationTimestamp="2025-09-30 17:36:01 +0000 UTC" firstStartedPulling="2025-09-30 17:36:02.251086415 +0000 UTC m=+1101.240984218" lastFinishedPulling="2025-09-30 17:36:04.887514377 +0000 UTC m=+1103.877412180" observedRunningTime="2025-09-30 17:36:05.516485307 +0000 UTC m=+1104.506383100" watchObservedRunningTime="2025-09-30 17:36:05.522408284 +0000 UTC m=+1104.512306087" Sep 30 17:36:05 crc kubenswrapper[4778]: I0930 17:36:05.541896 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.423769555 podStartE2EDuration="5.541879748s" podCreationTimestamp="2025-09-30 17:36:00 +0000 UTC" firstStartedPulling="2025-09-30 17:36:01.737958241 +0000 UTC m=+1100.727856044" lastFinishedPulling="2025-09-30 17:36:04.856068434 +0000 UTC m=+1103.845966237" observedRunningTime="2025-09-30 17:36:05.535734405 +0000 UTC m=+1104.525632218" watchObservedRunningTime="2025-09-30 17:36:05.541879748 +0000 UTC m=+1104.531777551" Sep 30 17:36:05 crc kubenswrapper[4778]: I0930 17:36:05.557952 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.561403498 podStartE2EDuration="5.557921484s" podCreationTimestamp="2025-09-30 17:36:00 +0000 UTC" firstStartedPulling="2025-09-30 17:36:01.858439453 +0000 UTC m=+1100.848337256" lastFinishedPulling="2025-09-30 17:36:04.854957439 +0000 UTC m=+1103.844855242" observedRunningTime="2025-09-30 17:36:05.554401773 +0000 UTC m=+1104.544299566" watchObservedRunningTime="2025-09-30 17:36:05.557921484 +0000 UTC m=+1104.547819297" Sep 30 17:36:06 crc kubenswrapper[4778]: I0930 17:36:06.312106 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 17:36:06 crc kubenswrapper[4778]: I0930 17:36:06.312162 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 17:36:06 crc kubenswrapper[4778]: I0930 17:36:06.509245 4778 generic.go:334] "Generic (PLEG): container finished" podID="54b3d113-98b5-4201-bef7-a7c3d7fe1c5b" containerID="877f10dffd5bf369aef3fa3bc28f27b41aaf09ced43615e8ba750ad4489869a9" exitCode=143 Sep 30 17:36:06 crc kubenswrapper[4778]: I0930 17:36:06.509341 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b","Type":"ContainerDied","Data":"877f10dffd5bf369aef3fa3bc28f27b41aaf09ced43615e8ba750ad4489869a9"} Sep 30 17:36:06 crc kubenswrapper[4778]: I0930 17:36:06.524854 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 17:36:06 crc kubenswrapper[4778]: I0930 17:36:06.531008 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:09 crc kubenswrapper[4778]: I0930 17:36:09.533459 4778 generic.go:334] "Generic (PLEG): container finished" podID="ca4b7091-db1d-4b92-9009-ccafd842d405" containerID="50699ae44e61c9c97eb836537114a703db357385e7a871995a01d573e5661fa9" exitCode=0 Sep 30 17:36:09 crc kubenswrapper[4778]: I0930 17:36:09.533553 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qf925" event={"ID":"ca4b7091-db1d-4b92-9009-ccafd842d405","Type":"ContainerDied","Data":"50699ae44e61c9c97eb836537114a703db357385e7a871995a01d573e5661fa9"} Sep 30 17:36:10 crc kubenswrapper[4778]: I0930 17:36:10.547357 4778 generic.go:334] "Generic (PLEG): container finished" podID="4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6" containerID="419f04cd88fee8958a7bee7106a0501124468df502b036ffa77804cd1497f03a" exitCode=0 Sep 30 17:36:10 crc kubenswrapper[4778]: I0930 17:36:10.547460 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7pc2n" event={"ID":"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6","Type":"ContainerDied","Data":"419f04cd88fee8958a7bee7106a0501124468df502b036ffa77804cd1497f03a"} Sep 30 17:36:10 crc kubenswrapper[4778]: I0930 17:36:10.898488 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qf925" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.044898 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b7xj\" (UniqueName: \"kubernetes.io/projected/ca4b7091-db1d-4b92-9009-ccafd842d405-kube-api-access-2b7xj\") pod \"ca4b7091-db1d-4b92-9009-ccafd842d405\" (UID: \"ca4b7091-db1d-4b92-9009-ccafd842d405\") " Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.046140 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-config-data\") pod \"ca4b7091-db1d-4b92-9009-ccafd842d405\" (UID: \"ca4b7091-db1d-4b92-9009-ccafd842d405\") " Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.046197 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-combined-ca-bundle\") pod \"ca4b7091-db1d-4b92-9009-ccafd842d405\" (UID: \"ca4b7091-db1d-4b92-9009-ccafd842d405\") " Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.046334 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-scripts\") pod \"ca4b7091-db1d-4b92-9009-ccafd842d405\" (UID: \"ca4b7091-db1d-4b92-9009-ccafd842d405\") " Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.052273 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca4b7091-db1d-4b92-9009-ccafd842d405-kube-api-access-2b7xj" (OuterVolumeSpecName: "kube-api-access-2b7xj") pod "ca4b7091-db1d-4b92-9009-ccafd842d405" (UID: "ca4b7091-db1d-4b92-9009-ccafd842d405"). InnerVolumeSpecName "kube-api-access-2b7xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.052488 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-scripts" (OuterVolumeSpecName: "scripts") pod "ca4b7091-db1d-4b92-9009-ccafd842d405" (UID: "ca4b7091-db1d-4b92-9009-ccafd842d405"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.074722 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-config-data" (OuterVolumeSpecName: "config-data") pod "ca4b7091-db1d-4b92-9009-ccafd842d405" (UID: "ca4b7091-db1d-4b92-9009-ccafd842d405"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.082679 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca4b7091-db1d-4b92-9009-ccafd842d405" (UID: "ca4b7091-db1d-4b92-9009-ccafd842d405"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.148413 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b7xj\" (UniqueName: \"kubernetes.io/projected/ca4b7091-db1d-4b92-9009-ccafd842d405-kube-api-access-2b7xj\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.148458 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.148473 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.148486 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca4b7091-db1d-4b92-9009-ccafd842d405-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.237018 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.237115 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.505771 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-745f868dcf-njjc6" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.524866 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.566582 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-qf925" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.577842 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-qf925" event={"ID":"ca4b7091-db1d-4b92-9009-ccafd842d405","Type":"ContainerDied","Data":"abc7a31cdb4ec40433ac0e6b7eb454a1041026f6c45e0976da4036637f696f4b"} Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.578135 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abc7a31cdb4ec40433ac0e6b7eb454a1041026f6c45e0976da4036637f696f4b" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.595000 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.601118 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f64d5748f-f9s5c"] Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.608784 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" podUID="06aa7d53-c968-419c-9782-f89c704e7ebe" containerName="dnsmasq-dns" containerID="cri-o://c8d82ada4adcb480b89d8301fc65bc430f2afd50c443192329988fca5fe19e06" gracePeriod=10 Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.677177 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.761728 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.761900 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f5995b85-629b-4772-85b4-22753e1935a0" containerName="nova-api-log" containerID="cri-o://ceabbf2a59c1b1361e0f94cf857f20f528f70723f191b0392d8e54bcf17807aa" gracePeriod=30 Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.762262 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f5995b85-629b-4772-85b4-22753e1935a0" containerName="nova-api-api" containerID="cri-o://cb74dd3b0b5f71e53ede8545cdecdb89065975b52a7b9d34259c83695fb82110" gracePeriod=30 Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.771695 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f5995b85-629b-4772-85b4-22753e1935a0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.167:8774/\": EOF" Sep 30 17:36:11 crc kubenswrapper[4778]: I0930 17:36:11.776863 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f5995b85-629b-4772-85b4-22753e1935a0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.167:8774/\": EOF" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.027330 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7pc2n" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.069683 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-config-data\") pod \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\" (UID: \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\") " Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.069726 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-scripts\") pod \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\" (UID: \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\") " Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.069748 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbmsw\" (UniqueName: \"kubernetes.io/projected/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-kube-api-access-bbmsw\") pod \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\" (UID: \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\") " Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.069799 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-combined-ca-bundle\") pod \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\" (UID: \"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6\") " Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.093187 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-kube-api-access-bbmsw" (OuterVolumeSpecName: "kube-api-access-bbmsw") pod "4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6" (UID: "4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6"). InnerVolumeSpecName "kube-api-access-bbmsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.108980 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6" (UID: "4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.110606 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-scripts" (OuterVolumeSpecName: "scripts") pod "4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6" (UID: "4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.160832 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-config-data" (OuterVolumeSpecName: "config-data") pod "4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6" (UID: "4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.171445 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.171479 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.171494 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbmsw\" (UniqueName: \"kubernetes.io/projected/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-kube-api-access-bbmsw\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.171508 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.205438 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.226543 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.374666 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-dns-svc\") pod \"06aa7d53-c968-419c-9782-f89c704e7ebe\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.376045 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-ovsdbserver-sb\") pod \"06aa7d53-c968-419c-9782-f89c704e7ebe\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.376111 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-config\") pod \"06aa7d53-c968-419c-9782-f89c704e7ebe\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.376158 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-ovsdbserver-nb\") pod \"06aa7d53-c968-419c-9782-f89c704e7ebe\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.376178 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ln5p\" (UniqueName: \"kubernetes.io/projected/06aa7d53-c968-419c-9782-f89c704e7ebe-kube-api-access-4ln5p\") pod \"06aa7d53-c968-419c-9782-f89c704e7ebe\" (UID: \"06aa7d53-c968-419c-9782-f89c704e7ebe\") " Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.381421 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06aa7d53-c968-419c-9782-f89c704e7ebe-kube-api-access-4ln5p" (OuterVolumeSpecName: "kube-api-access-4ln5p") pod "06aa7d53-c968-419c-9782-f89c704e7ebe" (UID: 
"06aa7d53-c968-419c-9782-f89c704e7ebe"). InnerVolumeSpecName "kube-api-access-4ln5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.428964 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-config" (OuterVolumeSpecName: "config") pod "06aa7d53-c968-419c-9782-f89c704e7ebe" (UID: "06aa7d53-c968-419c-9782-f89c704e7ebe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.430408 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "06aa7d53-c968-419c-9782-f89c704e7ebe" (UID: "06aa7d53-c968-419c-9782-f89c704e7ebe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.430912 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "06aa7d53-c968-419c-9782-f89c704e7ebe" (UID: "06aa7d53-c968-419c-9782-f89c704e7ebe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.434263 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "06aa7d53-c968-419c-9782-f89c704e7ebe" (UID: "06aa7d53-c968-419c-9782-f89c704e7ebe"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.478911 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.478947 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.478961 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-config\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.478969 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/06aa7d53-c968-419c-9782-f89c704e7ebe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.478979 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ln5p\" (UniqueName: \"kubernetes.io/projected/06aa7d53-c968-419c-9782-f89c704e7ebe-kube-api-access-4ln5p\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.576369 4778 generic.go:334] "Generic (PLEG): container finished" podID="f5995b85-629b-4772-85b4-22753e1935a0" containerID="ceabbf2a59c1b1361e0f94cf857f20f528f70723f191b0392d8e54bcf17807aa" exitCode=143 Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.576454 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5995b85-629b-4772-85b4-22753e1935a0","Type":"ContainerDied","Data":"ceabbf2a59c1b1361e0f94cf857f20f528f70723f191b0392d8e54bcf17807aa"} Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.579952 4778 generic.go:334] "Generic (PLEG): container finished" podID="06aa7d53-c968-419c-9782-f89c704e7ebe" containerID="c8d82ada4adcb480b89d8301fc65bc430f2afd50c443192329988fca5fe19e06" exitCode=0 Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.579998 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.580064 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" event={"ID":"06aa7d53-c968-419c-9782-f89c704e7ebe","Type":"ContainerDied","Data":"c8d82ada4adcb480b89d8301fc65bc430f2afd50c443192329988fca5fe19e06"} Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.580104 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f64d5748f-f9s5c" event={"ID":"06aa7d53-c968-419c-9782-f89c704e7ebe","Type":"ContainerDied","Data":"528a3a86c7476e08868e4844a34eabb25ccaa5e71f4368b28a148261e6658708"} Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.580127 4778 scope.go:117] "RemoveContainer" containerID="c8d82ada4adcb480b89d8301fc65bc430f2afd50c443192329988fca5fe19e06" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.582774 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7pc2n" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.583797 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7pc2n" event={"ID":"4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6","Type":"ContainerDied","Data":"98ffec1f268770cf7a881be2d8bf72a14195e30e91ec42b7612b3c2b57270099"} Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.583824 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98ffec1f268770cf7a881be2d8bf72a14195e30e91ec42b7612b3c2b57270099" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.615202 4778 scope.go:117] "RemoveContainer" containerID="4877f312b8679ca937fbfb68f157f8a07c9a3cfe3bcf0bfd4b950a405243180c" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.621550 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f64d5748f-f9s5c"] Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.630634 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f64d5748f-f9s5c"] Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.651315 4778 scope.go:117] "RemoveContainer" containerID="c8d82ada4adcb480b89d8301fc65bc430f2afd50c443192329988fca5fe19e06" Sep 30 17:36:12 crc kubenswrapper[4778]: E0930 17:36:12.651758 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8d82ada4adcb480b89d8301fc65bc430f2afd50c443192329988fca5fe19e06\": container with ID starting with c8d82ada4adcb480b89d8301fc65bc430f2afd50c443192329988fca5fe19e06 not found: ID does not exist" containerID="c8d82ada4adcb480b89d8301fc65bc430f2afd50c443192329988fca5fe19e06" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.651803 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8d82ada4adcb480b89d8301fc65bc430f2afd50c443192329988fca5fe19e06"} err="failed to get container status \"c8d82ada4adcb480b89d8301fc65bc430f2afd50c443192329988fca5fe19e06\": rpc error: code = NotFound desc = could not find container \"c8d82ada4adcb480b89d8301fc65bc430f2afd50c443192329988fca5fe19e06\": container with ID starting with c8d82ada4adcb480b89d8301fc65bc430f2afd50c443192329988fca5fe19e06 not found: ID does not exist" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.651833 4778 scope.go:117] "RemoveContainer" containerID="4877f312b8679ca937fbfb68f157f8a07c9a3cfe3bcf0bfd4b950a405243180c" Sep 30 17:36:12 crc kubenswrapper[4778]: E0930 17:36:12.652059 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4877f312b8679ca937fbfb68f157f8a07c9a3cfe3bcf0bfd4b950a405243180c\": container with ID starting with 4877f312b8679ca937fbfb68f157f8a07c9a3cfe3bcf0bfd4b950a405243180c not found: ID does not exist" containerID="4877f312b8679ca937fbfb68f157f8a07c9a3cfe3bcf0bfd4b950a405243180c" Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.652077 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4877f312b8679ca937fbfb68f157f8a07c9a3cfe3bcf0bfd4b950a405243180c"} err="failed to get container status \"4877f312b8679ca937fbfb68f157f8a07c9a3cfe3bcf0bfd4b950a405243180c\": rpc error: code = NotFound desc = could not find container \"4877f312b8679ca937fbfb68f157f8a07c9a3cfe3bcf0bfd4b950a405243180c\": container with ID starting with 
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.657013 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Sep 30 17:36:12 crc kubenswrapper[4778]: E0930 17:36:12.657432 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06aa7d53-c968-419c-9782-f89c704e7ebe" containerName="dnsmasq-dns"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.657451 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="06aa7d53-c968-419c-9782-f89c704e7ebe" containerName="dnsmasq-dns"
Sep 30 17:36:12 crc kubenswrapper[4778]: E0930 17:36:12.657479 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6" containerName="nova-cell1-conductor-db-sync"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.657486 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6" containerName="nova-cell1-conductor-db-sync"
Sep 30 17:36:12 crc kubenswrapper[4778]: E0930 17:36:12.657495 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4b7091-db1d-4b92-9009-ccafd842d405" containerName="nova-manage"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.657501 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4b7091-db1d-4b92-9009-ccafd842d405" containerName="nova-manage"
Sep 30 17:36:12 crc kubenswrapper[4778]: E0930 17:36:12.657518 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06aa7d53-c968-419c-9782-f89c704e7ebe" containerName="init"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.657524 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="06aa7d53-c968-419c-9782-f89c704e7ebe" containerName="init"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.657696 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca4b7091-db1d-4b92-9009-ccafd842d405" containerName="nova-manage"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.657710 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6" containerName="nova-cell1-conductor-db-sync"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.657728 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="06aa7d53-c968-419c-9782-f89c704e7ebe" containerName="dnsmasq-dns"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.658355 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.662354 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.672870 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.795014 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85164e57-f621-42ce-84f4-bf54119c5bb6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"85164e57-f621-42ce-84f4-bf54119c5bb6\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.795306 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flrjp\" (UniqueName: \"kubernetes.io/projected/85164e57-f621-42ce-84f4-bf54119c5bb6-kube-api-access-flrjp\") pod \"nova-cell1-conductor-0\" (UID: \"85164e57-f621-42ce-84f4-bf54119c5bb6\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.795497 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85164e57-f621-42ce-84f4-bf54119c5bb6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"85164e57-f621-42ce-84f4-bf54119c5bb6\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.898379 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85164e57-f621-42ce-84f4-bf54119c5bb6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"85164e57-f621-42ce-84f4-bf54119c5bb6\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.898480 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flrjp\" (UniqueName: \"kubernetes.io/projected/85164e57-f621-42ce-84f4-bf54119c5bb6-kube-api-access-flrjp\") pod \"nova-cell1-conductor-0\" (UID: \"85164e57-f621-42ce-84f4-bf54119c5bb6\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.898549 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85164e57-f621-42ce-84f4-bf54119c5bb6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"85164e57-f621-42ce-84f4-bf54119c5bb6\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.903864 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85164e57-f621-42ce-84f4-bf54119c5bb6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"85164e57-f621-42ce-84f4-bf54119c5bb6\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.903982 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85164e57-f621-42ce-84f4-bf54119c5bb6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"85164e57-f621-42ce-84f4-bf54119c5bb6\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.931771 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flrjp\" (UniqueName: \"kubernetes.io/projected/85164e57-f621-42ce-84f4-bf54119c5bb6-kube-api-access-flrjp\") pod \"nova-cell1-conductor-0\" (UID: \"85164e57-f621-42ce-84f4-bf54119c5bb6\") " pod="openstack/nova-cell1-conductor-0"
Sep 30 17:36:12 crc kubenswrapper[4778]: I0930 17:36:12.988177 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Sep 30 17:36:13 crc kubenswrapper[4778]: I0930 17:36:13.471393 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Sep 30 17:36:13 crc kubenswrapper[4778]: W0930 17:36:13.472056 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85164e57_f621_42ce_84f4_bf54119c5bb6.slice/crio-cf5e801da0c72b81f214157d71b70e329825517bf8be31220687ad089c17bf81 WatchSource:0}: Error finding container cf5e801da0c72b81f214157d71b70e329825517bf8be31220687ad089c17bf81: Status 404 returned error can't find the container with id cf5e801da0c72b81f214157d71b70e329825517bf8be31220687ad089c17bf81
Sep 30 17:36:13 crc kubenswrapper[4778]: I0930 17:36:13.593049 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"85164e57-f621-42ce-84f4-bf54119c5bb6","Type":"ContainerStarted","Data":"cf5e801da0c72b81f214157d71b70e329825517bf8be31220687ad089c17bf81"}
Sep 30 17:36:13 crc kubenswrapper[4778]: I0930 17:36:13.594721 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c" containerName="nova-scheduler-scheduler" containerID="cri-o://553928e888fc4423bf37e8a7d0b94370bd06d41e5e7486232c172115fa61f165" gracePeriod=30
Sep 30 17:36:13 crc kubenswrapper[4778]: I0930 17:36:13.727997 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06aa7d53-c968-419c-9782-f89c704e7ebe" path="/var/lib/kubelet/pods/06aa7d53-c968-419c-9782-f89c704e7ebe/volumes"
Sep 30 17:36:14 crc kubenswrapper[4778]: I0930 17:36:14.608572 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"85164e57-f621-42ce-84f4-bf54119c5bb6","Type":"ContainerStarted","Data":"b3649993515e6058275b20b56593067f0496f448701769220e21a5765a720e9b"}
Sep 30 17:36:14 crc kubenswrapper[4778]: I0930 17:36:14.609134 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Sep 30 17:36:14 crc kubenswrapper[4778]: I0930 17:36:14.631683 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.631583588 podStartE2EDuration="2.631583588s" podCreationTimestamp="2025-09-30 17:36:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:36:14.628835781 +0000 UTC m=+1113.618733604" watchObservedRunningTime="2025-09-30 17:36:14.631583588 +0000 UTC m=+1113.621481411"
Sep 30 17:36:16 crc kubenswrapper[4778]: E0930 17:36:16.526944 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="553928e888fc4423bf37e8a7d0b94370bd06d41e5e7486232c172115fa61f165" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Sep 30 17:36:16 crc kubenswrapper[4778]: E0930 17:36:16.531844 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="553928e888fc4423bf37e8a7d0b94370bd06d41e5e7486232c172115fa61f165" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Sep 30 17:36:16 crc kubenswrapper[4778]: E0930 17:36:16.533749 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="553928e888fc4423bf37e8a7d0b94370bd06d41e5e7486232c172115fa61f165" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Sep 30 17:36:16 crc kubenswrapper[4778]: E0930 17:36:16.533805 4778 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c" containerName="nova-scheduler-scheduler"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.414144 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.499418 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-combined-ca-bundle\") pod \"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c\" (UID: \"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c\") "
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.499535 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-config-data\") pod \"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c\" (UID: \"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c\") "
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.499651 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzb95\" (UniqueName: \"kubernetes.io/projected/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-kube-api-access-pzb95\") pod \"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c\" (UID: \"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c\") "
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.524463 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-kube-api-access-pzb95" (OuterVolumeSpecName: "kube-api-access-pzb95") pod "b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c" (UID: "b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c"). InnerVolumeSpecName "kube-api-access-pzb95". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.544788 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c" (UID: "b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.569481 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-config-data" (OuterVolumeSpecName: "config-data") pod "b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c" (UID: "b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.603782 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.604088 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.604097 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzb95\" (UniqueName: \"kubernetes.io/projected/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c-kube-api-access-pzb95\") on node \"crc\" DevicePath \"\""
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.609676 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.646231 4778 generic.go:334] "Generic (PLEG): container finished" podID="f5995b85-629b-4772-85b4-22753e1935a0" containerID="cb74dd3b0b5f71e53ede8545cdecdb89065975b52a7b9d34259c83695fb82110" exitCode=0
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.646316 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5995b85-629b-4772-85b4-22753e1935a0","Type":"ContainerDied","Data":"cb74dd3b0b5f71e53ede8545cdecdb89065975b52a7b9d34259c83695fb82110"}
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.646364 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5995b85-629b-4772-85b4-22753e1935a0","Type":"ContainerDied","Data":"1aba9f6dfe8e5c515c6cb30a4eb9ecf0347dd0f546303b8e55be8637d4be00e4"}
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.646385 4778 scope.go:117] "RemoveContainer" containerID="cb74dd3b0b5f71e53ede8545cdecdb89065975b52a7b9d34259c83695fb82110"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.646536 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.647462 4778 generic.go:334] "Generic (PLEG): container finished" podID="b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c" containerID="553928e888fc4423bf37e8a7d0b94370bd06d41e5e7486232c172115fa61f165" exitCode=0
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.647493 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c","Type":"ContainerDied","Data":"553928e888fc4423bf37e8a7d0b94370bd06d41e5e7486232c172115fa61f165"}
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.647520 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c","Type":"ContainerDied","Data":"0e27c6c8bc653f652cf2800a5bca33ed2b16fe71270303b9993ac44ccd7720ba"}
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.647572 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.678326 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.685290 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.692683 4778 scope.go:117] "RemoveContainer" containerID="ceabbf2a59c1b1361e0f94cf857f20f528f70723f191b0392d8e54bcf17807aa"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.700723 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 17:36:17 crc kubenswrapper[4778]: E0930 17:36:17.701123 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5995b85-629b-4772-85b4-22753e1935a0" containerName="nova-api-log"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.701138 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5995b85-629b-4772-85b4-22753e1935a0" containerName="nova-api-log"
Sep 30 17:36:17 crc kubenswrapper[4778]: E0930 17:36:17.701149 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5995b85-629b-4772-85b4-22753e1935a0" containerName="nova-api-api"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.701156 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5995b85-629b-4772-85b4-22753e1935a0" containerName="nova-api-api"
Sep 30 17:36:17 crc kubenswrapper[4778]: E0930 17:36:17.701180 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c" containerName="nova-scheduler-scheduler"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.701186 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c" containerName="nova-scheduler-scheduler"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.701343 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5995b85-629b-4772-85b4-22753e1935a0" containerName="nova-api-log"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.701360 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5995b85-629b-4772-85b4-22753e1935a0" containerName="nova-api-api"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.701373 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c" containerName="nova-scheduler-scheduler"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.701971 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.704677 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2tqw\" (UniqueName: \"kubernetes.io/projected/f5995b85-629b-4772-85b4-22753e1935a0-kube-api-access-p2tqw\") pod \"f5995b85-629b-4772-85b4-22753e1935a0\" (UID: \"f5995b85-629b-4772-85b4-22753e1935a0\") "
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.704715 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5995b85-629b-4772-85b4-22753e1935a0-combined-ca-bundle\") pod \"f5995b85-629b-4772-85b4-22753e1935a0\" (UID: \"f5995b85-629b-4772-85b4-22753e1935a0\") "
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.704761 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5995b85-629b-4772-85b4-22753e1935a0-config-data\") pod \"f5995b85-629b-4772-85b4-22753e1935a0\" (UID: \"f5995b85-629b-4772-85b4-22753e1935a0\") "
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.704789 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5995b85-629b-4772-85b4-22753e1935a0-logs\") pod \"f5995b85-629b-4772-85b4-22753e1935a0\" (UID: \"f5995b85-629b-4772-85b4-22753e1935a0\") "
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.705556 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5995b85-629b-4772-85b4-22753e1935a0-logs" (OuterVolumeSpecName: "logs") pod "f5995b85-629b-4772-85b4-22753e1935a0" (UID: "f5995b85-629b-4772-85b4-22753e1935a0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.709188 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5995b85-629b-4772-85b4-22753e1935a0-kube-api-access-p2tqw" (OuterVolumeSpecName: "kube-api-access-p2tqw") pod "f5995b85-629b-4772-85b4-22753e1935a0" (UID: "f5995b85-629b-4772-85b4-22753e1935a0"). InnerVolumeSpecName "kube-api-access-p2tqw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.711299 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.717000 4778 scope.go:117] "RemoveContainer" containerID="cb74dd3b0b5f71e53ede8545cdecdb89065975b52a7b9d34259c83695fb82110"
Sep 30 17:36:17 crc kubenswrapper[4778]: E0930 17:36:17.717483 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb74dd3b0b5f71e53ede8545cdecdb89065975b52a7b9d34259c83695fb82110\": container with ID starting with cb74dd3b0b5f71e53ede8545cdecdb89065975b52a7b9d34259c83695fb82110 not found: ID does not exist" containerID="cb74dd3b0b5f71e53ede8545cdecdb89065975b52a7b9d34259c83695fb82110"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.717537 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb74dd3b0b5f71e53ede8545cdecdb89065975b52a7b9d34259c83695fb82110"} err="failed to get container status \"cb74dd3b0b5f71e53ede8545cdecdb89065975b52a7b9d34259c83695fb82110\": rpc error: code = NotFound desc = could not find container \"cb74dd3b0b5f71e53ede8545cdecdb89065975b52a7b9d34259c83695fb82110\": container with ID starting with cb74dd3b0b5f71e53ede8545cdecdb89065975b52a7b9d34259c83695fb82110 not found: ID does not exist"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.717572 4778 scope.go:117] "RemoveContainer" containerID="ceabbf2a59c1b1361e0f94cf857f20f528f70723f191b0392d8e54bcf17807aa"
Sep 30 17:36:17 crc kubenswrapper[4778]: E0930 17:36:17.717874 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceabbf2a59c1b1361e0f94cf857f20f528f70723f191b0392d8e54bcf17807aa\": container with ID starting with ceabbf2a59c1b1361e0f94cf857f20f528f70723f191b0392d8e54bcf17807aa not found: ID does not exist" containerID="ceabbf2a59c1b1361e0f94cf857f20f528f70723f191b0392d8e54bcf17807aa"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.717924 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceabbf2a59c1b1361e0f94cf857f20f528f70723f191b0392d8e54bcf17807aa"} err="failed to get container status \"ceabbf2a59c1b1361e0f94cf857f20f528f70723f191b0392d8e54bcf17807aa\": rpc error: code = NotFound desc = could not find container \"ceabbf2a59c1b1361e0f94cf857f20f528f70723f191b0392d8e54bcf17807aa\": container with ID starting with ceabbf2a59c1b1361e0f94cf857f20f528f70723f191b0392d8e54bcf17807aa not found: ID does not exist"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.717940 4778 scope.go:117] "RemoveContainer" containerID="553928e888fc4423bf37e8a7d0b94370bd06d41e5e7486232c172115fa61f165"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.732419 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c" path="/var/lib/kubelet/pods/b25e8d42-7ce4-4dd8-9cec-5ff75e29ef4c/volumes"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.734990 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5995b85-629b-4772-85b4-22753e1935a0-config-data" (OuterVolumeSpecName: "config-data") pod "f5995b85-629b-4772-85b4-22753e1935a0" (UID: "f5995b85-629b-4772-85b4-22753e1935a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.735341 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.741458 4778 scope.go:117] "RemoveContainer" containerID="553928e888fc4423bf37e8a7d0b94370bd06d41e5e7486232c172115fa61f165"
Sep 30 17:36:17 crc kubenswrapper[4778]: E0930 17:36:17.741979 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"553928e888fc4423bf37e8a7d0b94370bd06d41e5e7486232c172115fa61f165\": container with ID starting with 553928e888fc4423bf37e8a7d0b94370bd06d41e5e7486232c172115fa61f165 not found: ID does not exist" containerID="553928e888fc4423bf37e8a7d0b94370bd06d41e5e7486232c172115fa61f165"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.742020 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"553928e888fc4423bf37e8a7d0b94370bd06d41e5e7486232c172115fa61f165"} err="failed to get container status \"553928e888fc4423bf37e8a7d0b94370bd06d41e5e7486232c172115fa61f165\": rpc error: code = NotFound desc = could not find container \"553928e888fc4423bf37e8a7d0b94370bd06d41e5e7486232c172115fa61f165\": container with ID starting with 553928e888fc4423bf37e8a7d0b94370bd06d41e5e7486232c172115fa61f165 not found: ID does not exist"
Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.750097 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5995b85-629b-4772-85b4-22753e1935a0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5995b85-629b-4772-85b4-22753e1935a0" (UID: "f5995b85-629b-4772-85b4-22753e1935a0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.809658 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03376d6a-71aa-410f-9c28-d9beeb68dc6c-config-data\") pod \"nova-scheduler-0\" (UID: \"03376d6a-71aa-410f-9c28-d9beeb68dc6c\") " pod="openstack/nova-scheduler-0" Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.809798 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03376d6a-71aa-410f-9c28-d9beeb68dc6c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"03376d6a-71aa-410f-9c28-d9beeb68dc6c\") " pod="openstack/nova-scheduler-0" Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.809893 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6gqr\" (UniqueName: \"kubernetes.io/projected/03376d6a-71aa-410f-9c28-d9beeb68dc6c-kube-api-access-c6gqr\") pod \"nova-scheduler-0\" (UID: \"03376d6a-71aa-410f-9c28-d9beeb68dc6c\") " pod="openstack/nova-scheduler-0" Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.810018 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5995b85-629b-4772-85b4-22753e1935a0-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.810039 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5995b85-629b-4772-85b4-22753e1935a0-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.810056 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2tqw\" (UniqueName: \"kubernetes.io/projected/f5995b85-629b-4772-85b4-22753e1935a0-kube-api-access-p2tqw\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.810074 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5995b85-629b-4772-85b4-22753e1935a0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.911955 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03376d6a-71aa-410f-9c28-d9beeb68dc6c-config-data\") pod \"nova-scheduler-0\" (UID: \"03376d6a-71aa-410f-9c28-d9beeb68dc6c\") " pod="openstack/nova-scheduler-0" Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.912051 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03376d6a-71aa-410f-9c28-d9beeb68dc6c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"03376d6a-71aa-410f-9c28-d9beeb68dc6c\") " pod="openstack/nova-scheduler-0" Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.912122 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6gqr\" (UniqueName: \"kubernetes.io/projected/03376d6a-71aa-410f-9c28-d9beeb68dc6c-kube-api-access-c6gqr\") pod \"nova-scheduler-0\" (UID: \"03376d6a-71aa-410f-9c28-d9beeb68dc6c\") " pod="openstack/nova-scheduler-0" Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.916082 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/03376d6a-71aa-410f-9c28-d9beeb68dc6c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"03376d6a-71aa-410f-9c28-d9beeb68dc6c\") " pod="openstack/nova-scheduler-0" Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.916219 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03376d6a-71aa-410f-9c28-d9beeb68dc6c-config-data\") pod \"nova-scheduler-0\" (UID: \"03376d6a-71aa-410f-9c28-d9beeb68dc6c\") " pod="openstack/nova-scheduler-0" Sep 30 17:36:17 crc kubenswrapper[4778]: I0930 17:36:17.928664 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6gqr\" (UniqueName: \"kubernetes.io/projected/03376d6a-71aa-410f-9c28-d9beeb68dc6c-kube-api-access-c6gqr\") pod \"nova-scheduler-0\" (UID: \"03376d6a-71aa-410f-9c28-d9beeb68dc6c\") " pod="openstack/nova-scheduler-0" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.024410 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.031704 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.040024 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.049425 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.051035 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.055069 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.069300 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.116302 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\") " pod="openstack/nova-api-0" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.116384 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-logs\") pod \"nova-api-0\" (UID: \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\") " pod="openstack/nova-api-0" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.116636 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-config-data\") pod \"nova-api-0\" (UID: \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\") " pod="openstack/nova-api-0" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.116797 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj278\" (UniqueName: \"kubernetes.io/projected/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-kube-api-access-zj278\") pod \"nova-api-0\" (UID: \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\") " pod="openstack/nova-api-0" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.219369 4778 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-config-data\") pod \"nova-api-0\" (UID: \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\") " pod="openstack/nova-api-0" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.219490 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj278\" (UniqueName: \"kubernetes.io/projected/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-kube-api-access-zj278\") pod \"nova-api-0\" (UID: \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\") " pod="openstack/nova-api-0" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.219567 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\") " pod="openstack/nova-api-0" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.219638 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-logs\") pod \"nova-api-0\" (UID: \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\") " pod="openstack/nova-api-0" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.220289 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-logs\") pod \"nova-api-0\" (UID: \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\") " pod="openstack/nova-api-0" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.229098 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-config-data\") pod \"nova-api-0\" (UID: \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\") " pod="openstack/nova-api-0" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.238562 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\") " pod="openstack/nova-api-0" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.239042 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj278\" (UniqueName: \"kubernetes.io/projected/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-kube-api-access-zj278\") pod \"nova-api-0\" (UID: \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\") " pod="openstack/nova-api-0" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.311772 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 17:36:18 crc kubenswrapper[4778]: W0930 17:36:18.322020 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03376d6a_71aa_410f_9c28_d9beeb68dc6c.slice/crio-5b9b5baa43025c14e112e7bbbba33b7bbc9c42f91c709108de639d9313d1c22a WatchSource:0}: Error finding container 5b9b5baa43025c14e112e7bbbba33b7bbc9c42f91c709108de639d9313d1c22a: Status 404 returned error can't find the container with id 5b9b5baa43025c14e112e7bbbba33b7bbc9c42f91c709108de639d9313d1c22a Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.375480 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.659690 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"03376d6a-71aa-410f-9c28-d9beeb68dc6c","Type":"ContainerStarted","Data":"64ba09d8d10ad0c06e3f27146824acbe8ee4ce2ecee827444a4584782e9849cc"} Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.659725 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"03376d6a-71aa-410f-9c28-d9beeb68dc6c","Type":"ContainerStarted","Data":"5b9b5baa43025c14e112e7bbbba33b7bbc9c42f91c709108de639d9313d1c22a"} Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.678259 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.678234724 podStartE2EDuration="1.678234724s" podCreationTimestamp="2025-09-30 17:36:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:36:18.672773702 +0000 UTC m=+1117.662671545" watchObservedRunningTime="2025-09-30 17:36:18.678234724 +0000 UTC m=+1117.668132557" Sep 30 17:36:18 crc kubenswrapper[4778]: I0930 17:36:18.873557 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:36:18 crc kubenswrapper[4778]: W0930 17:36:18.873729 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69e05edb_cb63_4b60_95fa_2735a9b2d9f1.slice/crio-87a7eb0dc2d6bcc3d9c1e261d15983af1aec0a1d6f5960f179c8f926a6e940a1 WatchSource:0}: Error finding container 87a7eb0dc2d6bcc3d9c1e261d15983af1aec0a1d6f5960f179c8f926a6e940a1: Status 404 returned error can't find the container with id 87a7eb0dc2d6bcc3d9c1e261d15983af1aec0a1d6f5960f179c8f926a6e940a1 Sep 30 17:36:19 crc kubenswrapper[4778]: I0930 17:36:19.672662 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"69e05edb-cb63-4b60-95fa-2735a9b2d9f1","Type":"ContainerStarted","Data":"e27fdd90a8541c5a8a6157f484c653634b13583ca4443b9503d65aa855e77ab0"} Sep 30 17:36:19 crc kubenswrapper[4778]: I0930 17:36:19.672890 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"69e05edb-cb63-4b60-95fa-2735a9b2d9f1","Type":"ContainerStarted","Data":"4d8d206c647d645fa112d8fd86e9340fc355ca863a5eeb9d051345ce1dfa7d9e"} Sep 30 17:36:19 crc kubenswrapper[4778]: I0930 17:36:19.672901 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"69e05edb-cb63-4b60-95fa-2735a9b2d9f1","Type":"ContainerStarted","Data":"87a7eb0dc2d6bcc3d9c1e261d15983af1aec0a1d6f5960f179c8f926a6e940a1"} Sep 30 17:36:19 crc kubenswrapper[4778]: I0930 17:36:19.692739 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.69272149 podStartE2EDuration="1.69272149s" podCreationTimestamp="2025-09-30 17:36:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:36:19.688333492 +0000 UTC m=+1118.678231315" watchObservedRunningTime="2025-09-30 17:36:19.69272149 +0000 UTC m=+1118.682619293" Sep 30 17:36:19 crc kubenswrapper[4778]: I0930 17:36:19.727429 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5995b85-629b-4772-85b4-22753e1935a0" 
path="/var/lib/kubelet/pods/f5995b85-629b-4772-85b4-22753e1935a0/volumes" Sep 30 17:36:23 crc kubenswrapper[4778]: I0930 17:36:23.032900 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 17:36:23 crc kubenswrapper[4778]: I0930 17:36:23.033320 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Sep 30 17:36:28 crc kubenswrapper[4778]: I0930 17:36:28.032074 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 17:36:28 crc kubenswrapper[4778]: I0930 17:36:28.060536 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 17:36:28 crc kubenswrapper[4778]: I0930 17:36:28.376142 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 17:36:28 crc kubenswrapper[4778]: I0930 17:36:28.376220 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 17:36:28 crc kubenswrapper[4778]: I0930 17:36:28.811426 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 17:36:29 crc kubenswrapper[4778]: I0930 17:36:29.458798 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="69e05edb-cb63-4b60-95fa-2735a9b2d9f1" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.175:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 17:36:29 crc kubenswrapper[4778]: I0930 17:36:29.458841 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="69e05edb-cb63-4b60-95fa-2735a9b2d9f1" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.175:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 17:36:35 crc kubenswrapper[4778]: I0930 17:36:35.836391 4778 generic.go:334] "Generic (PLEG): container finished" podID="54b3d113-98b5-4201-bef7-a7c3d7fe1c5b" containerID="30ec4090d9d28a7ce8850595f63fb25489ef44a273ee5eaad477b8790f6b8d54" exitCode=137 Sep 30 17:36:35 crc kubenswrapper[4778]: I0930 17:36:35.837006 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b","Type":"ContainerDied","Data":"30ec4090d9d28a7ce8850595f63fb25489ef44a273ee5eaad477b8790f6b8d54"} Sep 30 17:36:35 crc kubenswrapper[4778]: I0930 17:36:35.838887 4778 generic.go:334] "Generic (PLEG): container finished" podID="9171d6da-2c81-45f9-a295-92b42f5a6c09" containerID="499ca2af5f7a85a01e8327be79b09172d708b959d5612515a4694028383701a3" exitCode=137 Sep 30 17:36:35 crc kubenswrapper[4778]: I0930 17:36:35.838916 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9171d6da-2c81-45f9-a295-92b42f5a6c09","Type":"ContainerDied","Data":"499ca2af5f7a85a01e8327be79b09172d708b959d5612515a4694028383701a3"} Sep 30 17:36:35 crc kubenswrapper[4778]: I0930 17:36:35.950193 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:35 crc kubenswrapper[4778]: I0930 17:36:35.961739 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:36:35 crc kubenswrapper[4778]: I0930 17:36:35.983971 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-logs\") pod \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\" (UID: \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\") " Sep 30 17:36:35 crc kubenswrapper[4778]: I0930 17:36:35.984106 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-config-data\") pod \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\" (UID: \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\") " Sep 30 17:36:35 crc kubenswrapper[4778]: I0930 17:36:35.984228 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxhjk\" (UniqueName: \"kubernetes.io/projected/9171d6da-2c81-45f9-a295-92b42f5a6c09-kube-api-access-lxhjk\") pod \"9171d6da-2c81-45f9-a295-92b42f5a6c09\" (UID: \"9171d6da-2c81-45f9-a295-92b42f5a6c09\") " Sep 30 17:36:35 crc kubenswrapper[4778]: I0930 17:36:35.984306 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7kwr\" (UniqueName: \"kubernetes.io/projected/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-kube-api-access-s7kwr\") pod \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\" (UID: \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\") " Sep 30 17:36:35 crc kubenswrapper[4778]: I0930 17:36:35.984369 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-combined-ca-bundle\") pod \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\" (UID: \"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b\") " Sep 30 17:36:35 crc kubenswrapper[4778]: I0930 17:36:35.984406 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9171d6da-2c81-45f9-a295-92b42f5a6c09-config-data\") pod \"9171d6da-2c81-45f9-a295-92b42f5a6c09\" (UID: \"9171d6da-2c81-45f9-a295-92b42f5a6c09\") " Sep 30 17:36:35 crc kubenswrapper[4778]: I0930 17:36:35.984491 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9171d6da-2c81-45f9-a295-92b42f5a6c09-combined-ca-bundle\") pod \"9171d6da-2c81-45f9-a295-92b42f5a6c09\" (UID: \"9171d6da-2c81-45f9-a295-92b42f5a6c09\") " Sep 30 17:36:35 crc kubenswrapper[4778]: I0930 17:36:35.987527 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-logs" (OuterVolumeSpecName: "logs") pod "54b3d113-98b5-4201-bef7-a7c3d7fe1c5b" (UID: "54b3d113-98b5-4201-bef7-a7c3d7fe1c5b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:36:35 crc kubenswrapper[4778]: I0930 17:36:35.999887 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-kube-api-access-s7kwr" (OuterVolumeSpecName: "kube-api-access-s7kwr") pod "54b3d113-98b5-4201-bef7-a7c3d7fe1c5b" (UID: "54b3d113-98b5-4201-bef7-a7c3d7fe1c5b"). InnerVolumeSpecName "kube-api-access-s7kwr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.006232 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9171d6da-2c81-45f9-a295-92b42f5a6c09-kube-api-access-lxhjk" (OuterVolumeSpecName: "kube-api-access-lxhjk") pod "9171d6da-2c81-45f9-a295-92b42f5a6c09" (UID: "9171d6da-2c81-45f9-a295-92b42f5a6c09"). InnerVolumeSpecName "kube-api-access-lxhjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.041995 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54b3d113-98b5-4201-bef7-a7c3d7fe1c5b" (UID: "54b3d113-98b5-4201-bef7-a7c3d7fe1c5b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.045569 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9171d6da-2c81-45f9-a295-92b42f5a6c09-config-data" (OuterVolumeSpecName: "config-data") pod "9171d6da-2c81-45f9-a295-92b42f5a6c09" (UID: "9171d6da-2c81-45f9-a295-92b42f5a6c09"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.048765 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9171d6da-2c81-45f9-a295-92b42f5a6c09-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9171d6da-2c81-45f9-a295-92b42f5a6c09" (UID: "9171d6da-2c81-45f9-a295-92b42f5a6c09"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.061789 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-config-data" (OuterVolumeSpecName: "config-data") pod "54b3d113-98b5-4201-bef7-a7c3d7fe1c5b" (UID: "54b3d113-98b5-4201-bef7-a7c3d7fe1c5b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.087416 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9171d6da-2c81-45f9-a295-92b42f5a6c09-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.087862 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.087878 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.087915 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxhjk\" (UniqueName: \"kubernetes.io/projected/9171d6da-2c81-45f9-a295-92b42f5a6c09-kube-api-access-lxhjk\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.087933 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7kwr\" (UniqueName: \"kubernetes.io/projected/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-kube-api-access-s7kwr\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.087944 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.087957 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9171d6da-2c81-45f9-a295-92b42f5a6c09-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.851673 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"54b3d113-98b5-4201-bef7-a7c3d7fe1c5b","Type":"ContainerDied","Data":"b91393f58a3da5e82daa17a5dc4f0e06fe41ab81e8d0c8a701b6f1b2a818ed86"} Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.851726 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.851749 4778 scope.go:117] "RemoveContainer" containerID="30ec4090d9d28a7ce8850595f63fb25489ef44a273ee5eaad477b8790f6b8d54" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.854634 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"9171d6da-2c81-45f9-a295-92b42f5a6c09","Type":"ContainerDied","Data":"7a08f3fc8ef43cbdf84a0fb09ae769009f27cecf66344b00d9e919b558572f12"} Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.854853 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.888250 4778 scope.go:117] "RemoveContainer" containerID="877f10dffd5bf369aef3fa3bc28f27b41aaf09ced43615e8ba750ad4489869a9" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.893871 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.916943 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.932069 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.947196 4778 scope.go:117] "RemoveContainer" containerID="499ca2af5f7a85a01e8327be79b09172d708b959d5612515a4694028383701a3" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.971427 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.986781 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:36:36 crc kubenswrapper[4778]: E0930 17:36:36.987227 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b3d113-98b5-4201-bef7-a7c3d7fe1c5b" containerName="nova-metadata-log" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.987245 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b3d113-98b5-4201-bef7-a7c3d7fe1c5b" containerName="nova-metadata-log" Sep 30 17:36:36 crc kubenswrapper[4778]: E0930 17:36:36.987277 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9171d6da-2c81-45f9-a295-92b42f5a6c09" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.987283 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9171d6da-2c81-45f9-a295-92b42f5a6c09" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 17:36:36 crc kubenswrapper[4778]: E0930 17:36:36.987293 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b3d113-98b5-4201-bef7-a7c3d7fe1c5b" containerName="nova-metadata-metadata" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.987300 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b3d113-98b5-4201-bef7-a7c3d7fe1c5b" containerName="nova-metadata-metadata" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.987457 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9171d6da-2c81-45f9-a295-92b42f5a6c09" containerName="nova-cell1-novncproxy-novncproxy" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.987473 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b3d113-98b5-4201-bef7-a7c3d7fe1c5b" containerName="nova-metadata-metadata" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.987486 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b3d113-98b5-4201-bef7-a7c3d7fe1c5b" containerName="nova-metadata-log" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.989106 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.991330 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.991716 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 17:36:36 crc kubenswrapper[4778]: I0930 17:36:36.996959 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.005630 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.007789 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.008204 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.008392 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.009758 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.019030 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.023281 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") " pod="openstack/nova-metadata-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.023346 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11bab32a-a35b-4e80-8eec-fc3d8a8f16f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"11bab32a-a35b-4e80-8eec-fc3d8a8f16f7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.023459 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") " pod="openstack/nova-metadata-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.023581 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-config-data\") pod \"nova-metadata-0\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") " pod="openstack/nova-metadata-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.023663 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2072ccae-3380-4294-915d-3e3bba81b1f7-logs\") pod \"nova-metadata-0\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") " pod="openstack/nova-metadata-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 
17:36:37.023917 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/11bab32a-a35b-4e80-8eec-fc3d8a8f16f7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"11bab32a-a35b-4e80-8eec-fc3d8a8f16f7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.023971 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqdjk\" (UniqueName: \"kubernetes.io/projected/2072ccae-3380-4294-915d-3e3bba81b1f7-kube-api-access-cqdjk\") pod \"nova-metadata-0\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") " pod="openstack/nova-metadata-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.024035 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7n7h\" (UniqueName: \"kubernetes.io/projected/11bab32a-a35b-4e80-8eec-fc3d8a8f16f7-kube-api-access-x7n7h\") pod \"nova-cell1-novncproxy-0\" (UID: \"11bab32a-a35b-4e80-8eec-fc3d8a8f16f7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.024101 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11bab32a-a35b-4e80-8eec-fc3d8a8f16f7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"11bab32a-a35b-4e80-8eec-fc3d8a8f16f7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.024126 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/11bab32a-a35b-4e80-8eec-fc3d8a8f16f7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"11bab32a-a35b-4e80-8eec-fc3d8a8f16f7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.126242 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-config-data\") pod \"nova-metadata-0\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") " pod="openstack/nova-metadata-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.126305 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2072ccae-3380-4294-915d-3e3bba81b1f7-logs\") pod \"nova-metadata-0\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") " pod="openstack/nova-metadata-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.126389 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/11bab32a-a35b-4e80-8eec-fc3d8a8f16f7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"11bab32a-a35b-4e80-8eec-fc3d8a8f16f7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.126425 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqdjk\" (UniqueName: \"kubernetes.io/projected/2072ccae-3380-4294-915d-3e3bba81b1f7-kube-api-access-cqdjk\") pod \"nova-metadata-0\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") " pod="openstack/nova-metadata-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.126459 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x7n7h\" (UniqueName: \"kubernetes.io/projected/11bab32a-a35b-4e80-8eec-fc3d8a8f16f7-kube-api-access-x7n7h\") pod \"nova-cell1-novncproxy-0\" (UID: \"11bab32a-a35b-4e80-8eec-fc3d8a8f16f7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.126522 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11bab32a-a35b-4e80-8eec-fc3d8a8f16f7-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"11bab32a-a35b-4e80-8eec-fc3d8a8f16f7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.126764 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/11bab32a-a35b-4e80-8eec-fc3d8a8f16f7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"11bab32a-a35b-4e80-8eec-fc3d8a8f16f7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.127190 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2072ccae-3380-4294-915d-3e3bba81b1f7-logs\") pod \"nova-metadata-0\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") " pod="openstack/nova-metadata-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.127231 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") " pod="openstack/nova-metadata-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.127259 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11bab32a-a35b-4e80-8eec-fc3d8a8f16f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"11bab32a-a35b-4e80-8eec-fc3d8a8f16f7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.127280 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") " pod="openstack/nova-metadata-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.131958 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/11bab32a-a35b-4e80-8eec-fc3d8a8f16f7-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"11bab32a-a35b-4e80-8eec-fc3d8a8f16f7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.132681 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/11bab32a-a35b-4e80-8eec-fc3d8a8f16f7-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"11bab32a-a35b-4e80-8eec-fc3d8a8f16f7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.132778 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11bab32a-a35b-4e80-8eec-fc3d8a8f16f7-combined-ca-bundle\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"11bab32a-a35b-4e80-8eec-fc3d8a8f16f7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.133038 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") " pod="openstack/nova-metadata-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.133552 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") " pod="openstack/nova-metadata-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.139381 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-config-data\") pod \"nova-metadata-0\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") " pod="openstack/nova-metadata-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.142895 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11bab32a-a35b-4e80-8eec-fc3d8a8f16f7-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"11bab32a-a35b-4e80-8eec-fc3d8a8f16f7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.145139 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7n7h\" (UniqueName: \"kubernetes.io/projected/11bab32a-a35b-4e80-8eec-fc3d8a8f16f7-kube-api-access-x7n7h\") pod \"nova-cell1-novncproxy-0\" (UID: \"11bab32a-a35b-4e80-8eec-fc3d8a8f16f7\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.152722 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqdjk\" (UniqueName: \"kubernetes.io/projected/2072ccae-3380-4294-915d-3e3bba81b1f7-kube-api-access-cqdjk\") pod \"nova-metadata-0\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") " pod="openstack/nova-metadata-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.315899 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.328289 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.724249 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54b3d113-98b5-4201-bef7-a7c3d7fe1c5b" path="/var/lib/kubelet/pods/54b3d113-98b5-4201-bef7-a7c3d7fe1c5b/volumes" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.725157 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9171d6da-2c81-45f9-a295-92b42f5a6c09" path="/var/lib/kubelet/pods/9171d6da-2c81-45f9-a295-92b42f5a6c09/volumes" Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.867795 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 17:36:37 crc kubenswrapper[4778]: W0930 17:36:37.876679 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2072ccae_3380_4294_915d_3e3bba81b1f7.slice/crio-7c96daf149ccc2a479e9e85cebcfde7117216eddc8150a8888a6273a4f133402 WatchSource:0}: Error finding container 7c96daf149ccc2a479e9e85cebcfde7117216eddc8150a8888a6273a4f133402: Status 404 returned error can't find the container with id 7c96daf149ccc2a479e9e85cebcfde7117216eddc8150a8888a6273a4f133402 Sep 30 17:36:37 crc kubenswrapper[4778]: I0930 17:36:37.957005 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 17:36:37 crc kubenswrapper[4778]: W0930 17:36:37.959680 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11bab32a_a35b_4e80_8eec_fc3d8a8f16f7.slice/crio-6066131d71246b6a86798786175ee7883bad62cada54898089ccd4158d3874db WatchSource:0}: Error finding container 6066131d71246b6a86798786175ee7883bad62cada54898089ccd4158d3874db: Status 404 returned error can't find the container with id 6066131d71246b6a86798786175ee7883bad62cada54898089ccd4158d3874db Sep 30 17:36:38 crc kubenswrapper[4778]: I0930 17:36:38.383373 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 17:36:38 crc kubenswrapper[4778]: I0930 17:36:38.383932 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 17:36:38 crc kubenswrapper[4778]: I0930 17:36:38.384031 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 17:36:38 crc kubenswrapper[4778]: I0930 17:36:38.387170 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 17:36:38 crc kubenswrapper[4778]: I0930 17:36:38.876282 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"11bab32a-a35b-4e80-8eec-fc3d8a8f16f7","Type":"ContainerStarted","Data":"5d3712b0b9d5c9feb5e3c701c167cd9d8d4e7cabd5e7cecc5badf9fb93a058b7"} Sep 30 17:36:38 crc kubenswrapper[4778]: I0930 17:36:38.876593 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"11bab32a-a35b-4e80-8eec-fc3d8a8f16f7","Type":"ContainerStarted","Data":"6066131d71246b6a86798786175ee7883bad62cada54898089ccd4158d3874db"} Sep 30 17:36:38 crc kubenswrapper[4778]: I0930 17:36:38.880769 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2072ccae-3380-4294-915d-3e3bba81b1f7","Type":"ContainerStarted","Data":"1149cb5fe492549089b6958ac8ac4f97d3735242da8b537106e86fddf25f38f6"} Sep 30 17:36:38 crc 
kubenswrapper[4778]: I0930 17:36:38.880806 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2072ccae-3380-4294-915d-3e3bba81b1f7","Type":"ContainerStarted","Data":"cd14277033a20fbfac832ab4cc4ca24090783ebf5b1eebcdd9ccc66fb8fcee76"} Sep 30 17:36:38 crc kubenswrapper[4778]: I0930 17:36:38.880822 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 17:36:38 crc kubenswrapper[4778]: I0930 17:36:38.880834 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2072ccae-3380-4294-915d-3e3bba81b1f7","Type":"ContainerStarted","Data":"7c96daf149ccc2a479e9e85cebcfde7117216eddc8150a8888a6273a4f133402"} Sep 30 17:36:38 crc kubenswrapper[4778]: I0930 17:36:38.884125 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 17:36:38 crc kubenswrapper[4778]: I0930 17:36:38.906408 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.906385288 podStartE2EDuration="2.906385288s" podCreationTimestamp="2025-09-30 17:36:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:36:38.899707268 +0000 UTC m=+1137.889605071" watchObservedRunningTime="2025-09-30 17:36:38.906385288 +0000 UTC m=+1137.896283101" Sep 30 17:36:38 crc kubenswrapper[4778]: I0930 17:36:38.947150 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.947132135 podStartE2EDuration="2.947132135s" podCreationTimestamp="2025-09-30 17:36:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:36:38.942152838 +0000 UTC m=+1137.932050651" watchObservedRunningTime="2025-09-30 17:36:38.947132135 +0000 UTC m=+1137.937029928" Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.158673 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-778d8bb9d7-lvxlw"] Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.160660 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.173510 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-778d8bb9d7-lvxlw"] Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.305025 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae9e27fc-d20d-4adc-9a77-8a29bb1b262b-dns-svc\") pod \"dnsmasq-dns-778d8bb9d7-lvxlw\" (UID: \"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b\") " pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.305097 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5f58\" (UniqueName: \"kubernetes.io/projected/ae9e27fc-d20d-4adc-9a77-8a29bb1b262b-kube-api-access-j5f58\") pod \"dnsmasq-dns-778d8bb9d7-lvxlw\" (UID: \"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b\") " pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.305218 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae9e27fc-d20d-4adc-9a77-8a29bb1b262b-ovsdbserver-nb\") pod \"dnsmasq-dns-778d8bb9d7-lvxlw\" (UID: \"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b\") " pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.305237 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae9e27fc-d20d-4adc-9a77-8a29bb1b262b-ovsdbserver-sb\") pod \"dnsmasq-dns-778d8bb9d7-lvxlw\" (UID: \"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b\") " pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.305345 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae9e27fc-d20d-4adc-9a77-8a29bb1b262b-config\") pod \"dnsmasq-dns-778d8bb9d7-lvxlw\" (UID: \"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b\") " pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.407262 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5f58\" (UniqueName: \"kubernetes.io/projected/ae9e27fc-d20d-4adc-9a77-8a29bb1b262b-kube-api-access-j5f58\") pod \"dnsmasq-dns-778d8bb9d7-lvxlw\" (UID: \"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b\") " pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.407332 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae9e27fc-d20d-4adc-9a77-8a29bb1b262b-ovsdbserver-nb\") pod \"dnsmasq-dns-778d8bb9d7-lvxlw\" (UID: \"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b\") " pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.407353 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae9e27fc-d20d-4adc-9a77-8a29bb1b262b-ovsdbserver-sb\") pod \"dnsmasq-dns-778d8bb9d7-lvxlw\" (UID: \"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b\") " pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.407462 4778 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae9e27fc-d20d-4adc-9a77-8a29bb1b262b-config\") pod \"dnsmasq-dns-778d8bb9d7-lvxlw\" (UID: \"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b\") " pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.407483 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae9e27fc-d20d-4adc-9a77-8a29bb1b262b-dns-svc\") pod \"dnsmasq-dns-778d8bb9d7-lvxlw\" (UID: \"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b\") " pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.408457 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ae9e27fc-d20d-4adc-9a77-8a29bb1b262b-dns-svc\") pod \"dnsmasq-dns-778d8bb9d7-lvxlw\" (UID: \"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b\") " pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.408472 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ae9e27fc-d20d-4adc-9a77-8a29bb1b262b-config\") pod \"dnsmasq-dns-778d8bb9d7-lvxlw\" (UID: \"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b\") " pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.408491 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ae9e27fc-d20d-4adc-9a77-8a29bb1b262b-ovsdbserver-sb\") pod \"dnsmasq-dns-778d8bb9d7-lvxlw\" (UID: \"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b\") " pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.408970 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ae9e27fc-d20d-4adc-9a77-8a29bb1b262b-ovsdbserver-nb\") pod \"dnsmasq-dns-778d8bb9d7-lvxlw\" (UID: \"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b\") " pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.437407 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5f58\" (UniqueName: \"kubernetes.io/projected/ae9e27fc-d20d-4adc-9a77-8a29bb1b262b-kube-api-access-j5f58\") pod \"dnsmasq-dns-778d8bb9d7-lvxlw\" (UID: \"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b\") " pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:39 crc kubenswrapper[4778]: I0930 17:36:39.488423 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:40 crc kubenswrapper[4778]: I0930 17:36:40.014937 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-778d8bb9d7-lvxlw"] Sep 30 17:36:40 crc kubenswrapper[4778]: W0930 17:36:40.018796 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae9e27fc_d20d_4adc_9a77_8a29bb1b262b.slice/crio-2f4048a6831bffabcefb680a9e359ee64784ac1270481461a4ebd0d739d62f71 WatchSource:0}: Error finding container 2f4048a6831bffabcefb680a9e359ee64784ac1270481461a4ebd0d739d62f71: Status 404 returned error can't find the container with id 2f4048a6831bffabcefb680a9e359ee64784ac1270481461a4ebd0d739d62f71 Sep 30 17:36:40 crc kubenswrapper[4778]: I0930 17:36:40.897588 4778 generic.go:334] "Generic (PLEG): container finished" podID="ae9e27fc-d20d-4adc-9a77-8a29bb1b262b" containerID="272606bddca2ea04d81a61d580450dc4624d27e8721ebec0a5397d1908e3fea5" exitCode=0 Sep 30 17:36:40 crc kubenswrapper[4778]: I0930 17:36:40.899088 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" event={"ID":"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b","Type":"ContainerDied","Data":"272606bddca2ea04d81a61d580450dc4624d27e8721ebec0a5397d1908e3fea5"} Sep 30 17:36:40 crc kubenswrapper[4778]: I0930 17:36:40.899239 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" event={"ID":"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b","Type":"ContainerStarted","Data":"2f4048a6831bffabcefb680a9e359ee64784ac1270481461a4ebd0d739d62f71"} Sep 30 17:36:41 crc kubenswrapper[4778]: I0930 17:36:41.553142 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:36:41 crc kubenswrapper[4778]: I0930 17:36:41.910001 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" event={"ID":"ae9e27fc-d20d-4adc-9a77-8a29bb1b262b","Type":"ContainerStarted","Data":"827acec03905a06cf139992b1f12490d6d35000d13844336ad5e97ae1d045530"} Sep 30 17:36:41 crc kubenswrapper[4778]: I0930 17:36:41.910331 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="69e05edb-cb63-4b60-95fa-2735a9b2d9f1" containerName="nova-api-api" containerID="cri-o://e27fdd90a8541c5a8a6157f484c653634b13583ca4443b9503d65aa855e77ab0" gracePeriod=30 Sep 30 17:36:41 crc kubenswrapper[4778]: I0930 17:36:41.910458 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" Sep 30 17:36:41 crc kubenswrapper[4778]: I0930 17:36:41.910353 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="69e05edb-cb63-4b60-95fa-2735a9b2d9f1" containerName="nova-api-log" containerID="cri-o://4d8d206c647d645fa112d8fd86e9340fc355ca863a5eeb9d051345ce1dfa7d9e" gracePeriod=30 Sep 30 17:36:41 crc kubenswrapper[4778]: I0930 17:36:41.930875 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw" podStartSLOduration=2.930856228 podStartE2EDuration="2.930856228s" podCreationTimestamp="2025-09-30 17:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:36:41.927050837 +0000 UTC m=+1140.916948640" watchObservedRunningTime="2025-09-30 17:36:41.930856228 +0000 UTC m=+1140.920754031" Sep 
30 17:36:42 crc kubenswrapper[4778]: I0930 17:36:42.318066 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 17:36:42 crc kubenswrapper[4778]: I0930 17:36:42.318944 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 17:36:42 crc kubenswrapper[4778]: I0930 17:36:42.328737 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:42 crc kubenswrapper[4778]: I0930 17:36:42.921107 4778 generic.go:334] "Generic (PLEG): container finished" podID="69e05edb-cb63-4b60-95fa-2735a9b2d9f1" containerID="4d8d206c647d645fa112d8fd86e9340fc355ca863a5eeb9d051345ce1dfa7d9e" exitCode=143 Sep 30 17:36:42 crc kubenswrapper[4778]: I0930 17:36:42.921211 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"69e05edb-cb63-4b60-95fa-2735a9b2d9f1","Type":"ContainerDied","Data":"4d8d206c647d645fa112d8fd86e9340fc355ca863a5eeb9d051345ce1dfa7d9e"} Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.537258 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.657998 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-combined-ca-bundle\") pod \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\" (UID: \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\") " Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.658090 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-logs\") pod \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\" (UID: \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\") " Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.658135 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-config-data\") pod \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\" (UID: \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\") " Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.658220 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj278\" (UniqueName: \"kubernetes.io/projected/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-kube-api-access-zj278\") pod \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\" (UID: \"69e05edb-cb63-4b60-95fa-2735a9b2d9f1\") " Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.660096 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-logs" (OuterVolumeSpecName: "logs") pod "69e05edb-cb63-4b60-95fa-2735a9b2d9f1" (UID: "69e05edb-cb63-4b60-95fa-2735a9b2d9f1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.672800 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-kube-api-access-zj278" (OuterVolumeSpecName: "kube-api-access-zj278") pod "69e05edb-cb63-4b60-95fa-2735a9b2d9f1" (UID: "69e05edb-cb63-4b60-95fa-2735a9b2d9f1"). InnerVolumeSpecName "kube-api-access-zj278". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.692773 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69e05edb-cb63-4b60-95fa-2735a9b2d9f1" (UID: "69e05edb-cb63-4b60-95fa-2735a9b2d9f1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.696804 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-config-data" (OuterVolumeSpecName: "config-data") pod "69e05edb-cb63-4b60-95fa-2735a9b2d9f1" (UID: "69e05edb-cb63-4b60-95fa-2735a9b2d9f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.760386 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.760431 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.760443 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.760455 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj278\" (UniqueName: \"kubernetes.io/projected/69e05edb-cb63-4b60-95fa-2735a9b2d9f1-kube-api-access-zj278\") on node \"crc\" DevicePath \"\"" Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.953214 4778 generic.go:334] "Generic (PLEG): container finished" podID="69e05edb-cb63-4b60-95fa-2735a9b2d9f1" containerID="e27fdd90a8541c5a8a6157f484c653634b13583ca4443b9503d65aa855e77ab0" exitCode=0 Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.953280 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.953316 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"69e05edb-cb63-4b60-95fa-2735a9b2d9f1","Type":"ContainerDied","Data":"e27fdd90a8541c5a8a6157f484c653634b13583ca4443b9503d65aa855e77ab0"} Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.953723 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"69e05edb-cb63-4b60-95fa-2735a9b2d9f1","Type":"ContainerDied","Data":"87a7eb0dc2d6bcc3d9c1e261d15983af1aec0a1d6f5960f179c8f926a6e940a1"} Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.953769 4778 scope.go:117] "RemoveContainer" containerID="e27fdd90a8541c5a8a6157f484c653634b13583ca4443b9503d65aa855e77ab0" Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.974074 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.982530 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:36:45 crc kubenswrapper[4778]: I0930 17:36:45.988894 4778 scope.go:117] "RemoveContainer" containerID="4d8d206c647d645fa112d8fd86e9340fc355ca863a5eeb9d051345ce1dfa7d9e" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.000411 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 17:36:46 crc kubenswrapper[4778]: E0930 17:36:46.000908 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e05edb-cb63-4b60-95fa-2735a9b2d9f1" containerName="nova-api-log" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.000929 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e05edb-cb63-4b60-95fa-2735a9b2d9f1" containerName="nova-api-log" Sep 30 17:36:46 crc kubenswrapper[4778]: E0930 17:36:46.000945 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e05edb-cb63-4b60-95fa-2735a9b2d9f1" containerName="nova-api-api" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.000951 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e05edb-cb63-4b60-95fa-2735a9b2d9f1" containerName="nova-api-api" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.001123 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e05edb-cb63-4b60-95fa-2735a9b2d9f1" containerName="nova-api-api" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.001146 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e05edb-cb63-4b60-95fa-2735a9b2d9f1" containerName="nova-api-log" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.003774 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.007299 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.007484 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.007736 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.016721 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.040129 4778 scope.go:117] "RemoveContainer" containerID="e27fdd90a8541c5a8a6157f484c653634b13583ca4443b9503d65aa855e77ab0" Sep 30 17:36:46 crc kubenswrapper[4778]: E0930 17:36:46.040557 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e27fdd90a8541c5a8a6157f484c653634b13583ca4443b9503d65aa855e77ab0\": container with ID starting with e27fdd90a8541c5a8a6157f484c653634b13583ca4443b9503d65aa855e77ab0 not found: ID does not exist" containerID="e27fdd90a8541c5a8a6157f484c653634b13583ca4443b9503d65aa855e77ab0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.040586 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e27fdd90a8541c5a8a6157f484c653634b13583ca4443b9503d65aa855e77ab0"} err="failed to get container status \"e27fdd90a8541c5a8a6157f484c653634b13583ca4443b9503d65aa855e77ab0\": rpc error: code = NotFound desc = could not find container \"e27fdd90a8541c5a8a6157f484c653634b13583ca4443b9503d65aa855e77ab0\": container with ID starting with e27fdd90a8541c5a8a6157f484c653634b13583ca4443b9503d65aa855e77ab0 not found: ID does not exist" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.040653 4778 scope.go:117] "RemoveContainer" containerID="4d8d206c647d645fa112d8fd86e9340fc355ca863a5eeb9d051345ce1dfa7d9e" Sep 30 17:36:46 crc kubenswrapper[4778]: E0930 17:36:46.040906 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d8d206c647d645fa112d8fd86e9340fc355ca863a5eeb9d051345ce1dfa7d9e\": container with ID starting with 4d8d206c647d645fa112d8fd86e9340fc355ca863a5eeb9d051345ce1dfa7d9e not found: ID does not exist" containerID="4d8d206c647d645fa112d8fd86e9340fc355ca863a5eeb9d051345ce1dfa7d9e" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.040930 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d8d206c647d645fa112d8fd86e9340fc355ca863a5eeb9d051345ce1dfa7d9e"} err="failed to get container status \"4d8d206c647d645fa112d8fd86e9340fc355ca863a5eeb9d051345ce1dfa7d9e\": rpc error: code = NotFound desc = could not find container \"4d8d206c647d645fa112d8fd86e9340fc355ca863a5eeb9d051345ce1dfa7d9e\": container with ID starting with 4d8d206c647d645fa112d8fd86e9340fc355ca863a5eeb9d051345ce1dfa7d9e not found: ID does not exist" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.167045 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " pod="openstack/nova-api-0" Sep 30 
17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.167109 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.167147 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdqsb\" (UniqueName: \"kubernetes.io/projected/1a6cd22c-2090-4cc4-ac05-12ea37404383-kube-api-access-qdqsb\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.167187 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-public-tls-certs\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.167308 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-config-data\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.167416 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a6cd22c-2090-4cc4-ac05-12ea37404383-logs\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.268861 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-config-data\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.268977 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a6cd22c-2090-4cc4-ac05-12ea37404383-logs\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.269044 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.269568 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a6cd22c-2090-4cc4-ac05-12ea37404383-logs\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.269998 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " 
pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.270163 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdqsb\" (UniqueName: \"kubernetes.io/projected/1a6cd22c-2090-4cc4-ac05-12ea37404383-kube-api-access-qdqsb\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.270335 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-public-tls-certs\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.273738 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-internal-tls-certs\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.275906 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.276471 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-public-tls-certs\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.276796 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-config-data\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.301740 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdqsb\" (UniqueName: \"kubernetes.io/projected/1a6cd22c-2090-4cc4-ac05-12ea37404383-kube-api-access-qdqsb\") pod \"nova-api-0\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.340139 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.819838 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:36:46 crc kubenswrapper[4778]: I0930 17:36:46.962513 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a6cd22c-2090-4cc4-ac05-12ea37404383","Type":"ContainerStarted","Data":"67d69112cd59b359009aa7ecb5204361da2e4bdb9d09e0d47ce3a30050720261"} Sep 30 17:36:47 crc kubenswrapper[4778]: I0930 17:36:47.316347 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 17:36:47 crc kubenswrapper[4778]: I0930 17:36:47.317001 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 17:36:47 crc kubenswrapper[4778]: I0930 17:36:47.329845 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:47 crc kubenswrapper[4778]: I0930 17:36:47.349866 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:47 crc kubenswrapper[4778]: I0930 17:36:47.729111 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69e05edb-cb63-4b60-95fa-2735a9b2d9f1" path="/var/lib/kubelet/pods/69e05edb-cb63-4b60-95fa-2735a9b2d9f1/volumes" Sep 30 17:36:47 crc kubenswrapper[4778]: I0930 17:36:47.973712 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a6cd22c-2090-4cc4-ac05-12ea37404383","Type":"ContainerStarted","Data":"88891d050159c2339358a233f6b0de5f5824a47b1f78435b4ca2b78a9d24be88"} Sep 30 17:36:47 crc kubenswrapper[4778]: I0930 17:36:47.973768 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a6cd22c-2090-4cc4-ac05-12ea37404383","Type":"ContainerStarted","Data":"f559aa002ba988092e649881bf7cd55402dca4a53e799efe53f1b8acd7e9d7b6"} Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.016533 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.016512433 podStartE2EDuration="3.016512433s" podCreationTimestamp="2025-09-30 17:36:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:36:48.011606588 +0000 UTC m=+1147.001504471" watchObservedRunningTime="2025-09-30 17:36:48.016512433 +0000 UTC m=+1147.006410236" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.027571 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.256835 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-mt5l7"] Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.258376 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mt5l7" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.261945 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.262273 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.267441 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mt5l7"] Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.327917 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2072ccae-3380-4294-915d-3e3bba81b1f7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.176:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.327915 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="2072ccae-3380-4294-915d-3e3bba81b1f7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.176:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.425414 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-config-data\") pod \"nova-cell1-cell-mapping-mt5l7\" (UID: \"7268e8e0-16cf-4b12-8593-e579df617a0e\") " pod="openstack/nova-cell1-cell-mapping-mt5l7" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.425539 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6trvs\" (UniqueName: \"kubernetes.io/projected/7268e8e0-16cf-4b12-8593-e579df617a0e-kube-api-access-6trvs\") pod \"nova-cell1-cell-mapping-mt5l7\" (UID: \"7268e8e0-16cf-4b12-8593-e579df617a0e\") " pod="openstack/nova-cell1-cell-mapping-mt5l7" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.425766 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-scripts\") pod \"nova-cell1-cell-mapping-mt5l7\" (UID: \"7268e8e0-16cf-4b12-8593-e579df617a0e\") " pod="openstack/nova-cell1-cell-mapping-mt5l7" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.425965 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mt5l7\" (UID: \"7268e8e0-16cf-4b12-8593-e579df617a0e\") " pod="openstack/nova-cell1-cell-mapping-mt5l7" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.550170 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mt5l7\" (UID: \"7268e8e0-16cf-4b12-8593-e579df617a0e\") " pod="openstack/nova-cell1-cell-mapping-mt5l7" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.550358 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-config-data\") pod \"nova-cell1-cell-mapping-mt5l7\" (UID: \"7268e8e0-16cf-4b12-8593-e579df617a0e\") " pod="openstack/nova-cell1-cell-mapping-mt5l7" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.550446 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6trvs\" (UniqueName: \"kubernetes.io/projected/7268e8e0-16cf-4b12-8593-e579df617a0e-kube-api-access-6trvs\") pod \"nova-cell1-cell-mapping-mt5l7\" (UID: \"7268e8e0-16cf-4b12-8593-e579df617a0e\") " pod="openstack/nova-cell1-cell-mapping-mt5l7" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.551678 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-scripts\") pod \"nova-cell1-cell-mapping-mt5l7\" (UID: \"7268e8e0-16cf-4b12-8593-e579df617a0e\") " pod="openstack/nova-cell1-cell-mapping-mt5l7" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.569493 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-config-data\") pod \"nova-cell1-cell-mapping-mt5l7\" (UID: \"7268e8e0-16cf-4b12-8593-e579df617a0e\") " pod="openstack/nova-cell1-cell-mapping-mt5l7" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.569636 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-mt5l7\" (UID: \"7268e8e0-16cf-4b12-8593-e579df617a0e\") " pod="openstack/nova-cell1-cell-mapping-mt5l7" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.569989 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-scripts\") pod \"nova-cell1-cell-mapping-mt5l7\" (UID: \"7268e8e0-16cf-4b12-8593-e579df617a0e\") " pod="openstack/nova-cell1-cell-mapping-mt5l7" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.582890 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6trvs\" (UniqueName: \"kubernetes.io/projected/7268e8e0-16cf-4b12-8593-e579df617a0e-kube-api-access-6trvs\") pod \"nova-cell1-cell-mapping-mt5l7\" (UID: \"7268e8e0-16cf-4b12-8593-e579df617a0e\") " pod="openstack/nova-cell1-cell-mapping-mt5l7" Sep 30 17:36:48 crc kubenswrapper[4778]: I0930 17:36:48.879187 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mt5l7"
Sep 30 17:36:49 crc kubenswrapper[4778]: I0930 17:36:49.340890 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-mt5l7"]
Sep 30 17:36:49 crc kubenswrapper[4778]: I0930 17:36:49.490681 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-778d8bb9d7-lvxlw"
Sep 30 17:36:49 crc kubenswrapper[4778]: I0930 17:36:49.555894 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745f868dcf-njjc6"]
Sep 30 17:36:49 crc kubenswrapper[4778]: I0930 17:36:49.556381 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-745f868dcf-njjc6" podUID="2f31c368-4084-49e9-8d47-c7eb8541b0d2" containerName="dnsmasq-dns" containerID="cri-o://9399b08040b6e3234013278aa69986bf9af8df5134d42c7c0dfbfb4397c02dd0" gracePeriod=10
Sep 30 17:36:49 crc kubenswrapper[4778]: I0930 17:36:49.999545 4778 generic.go:334] "Generic (PLEG): container finished" podID="2f31c368-4084-49e9-8d47-c7eb8541b0d2" containerID="9399b08040b6e3234013278aa69986bf9af8df5134d42c7c0dfbfb4397c02dd0" exitCode=0
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:49.999783 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745f868dcf-njjc6" event={"ID":"2f31c368-4084-49e9-8d47-c7eb8541b0d2","Type":"ContainerDied","Data":"9399b08040b6e3234013278aa69986bf9af8df5134d42c7c0dfbfb4397c02dd0"}
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:49.999916 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-745f868dcf-njjc6" event={"ID":"2f31c368-4084-49e9-8d47-c7eb8541b0d2","Type":"ContainerDied","Data":"fdca139a5d23e24a602d9a24c1b9642a0b24ff7d0d88aa2ad9eb13aa8893df12"}
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:49.999931 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fdca139a5d23e24a602d9a24c1b9642a0b24ff7d0d88aa2ad9eb13aa8893df12"
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.001748 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mt5l7" event={"ID":"7268e8e0-16cf-4b12-8593-e579df617a0e","Type":"ContainerStarted","Data":"f7c488c44759fad25c0dc4cf6e61440cc324771c957d9535029796d90cfbe524"}
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.001786 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mt5l7" event={"ID":"7268e8e0-16cf-4b12-8593-e579df617a0e","Type":"ContainerStarted","Data":"1f0e4a4e35a0bf4cf659e530b9a849541d678d553e636bd8c96c34d2bf97ee68"}
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.030080 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-mt5l7" podStartSLOduration=2.030060278 podStartE2EDuration="2.030060278s" podCreationTimestamp="2025-09-30 17:36:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:36:50.017657596 +0000 UTC m=+1149.007555419" watchObservedRunningTime="2025-09-30 17:36:50.030060278 +0000 UTC m=+1149.019958081"
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.048585 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745f868dcf-njjc6"
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.078667 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-dns-svc\") pod \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") "
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.078986 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb2vd\" (UniqueName: \"kubernetes.io/projected/2f31c368-4084-49e9-8d47-c7eb8541b0d2-kube-api-access-nb2vd\") pod \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") "
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.079133 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-ovsdbserver-sb\") pod \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") "
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.079274 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-ovsdbserver-nb\") pod \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") "
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.079391 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-config\") pod \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\" (UID: \"2f31c368-4084-49e9-8d47-c7eb8541b0d2\") "
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.091280 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f31c368-4084-49e9-8d47-c7eb8541b0d2-kube-api-access-nb2vd" (OuterVolumeSpecName: "kube-api-access-nb2vd") pod "2f31c368-4084-49e9-8d47-c7eb8541b0d2" (UID: "2f31c368-4084-49e9-8d47-c7eb8541b0d2"). InnerVolumeSpecName "kube-api-access-nb2vd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.125832 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2f31c368-4084-49e9-8d47-c7eb8541b0d2" (UID: "2f31c368-4084-49e9-8d47-c7eb8541b0d2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.127085 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f31c368-4084-49e9-8d47-c7eb8541b0d2" (UID: "2f31c368-4084-49e9-8d47-c7eb8541b0d2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.135836 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2f31c368-4084-49e9-8d47-c7eb8541b0d2" (UID: "2f31c368-4084-49e9-8d47-c7eb8541b0d2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.145295 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-config" (OuterVolumeSpecName: "config") pod "2f31c368-4084-49e9-8d47-c7eb8541b0d2" (UID: "2f31c368-4084-49e9-8d47-c7eb8541b0d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.182317 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.182575 4778 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-config\") on node \"crc\" DevicePath \"\""
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.182675 4778 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.182738 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb2vd\" (UniqueName: \"kubernetes.io/projected/2f31c368-4084-49e9-8d47-c7eb8541b0d2-kube-api-access-nb2vd\") on node \"crc\" DevicePath \"\""
Sep 30 17:36:50 crc kubenswrapper[4778]: I0930 17:36:50.182797 4778 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f31c368-4084-49e9-8d47-c7eb8541b0d2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 30 17:36:51 crc kubenswrapper[4778]: I0930 17:36:51.009573 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-745f868dcf-njjc6"
Sep 30 17:36:51 crc kubenswrapper[4778]: I0930 17:36:51.047648 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-745f868dcf-njjc6"]
Sep 30 17:36:51 crc kubenswrapper[4778]: I0930 17:36:51.072972 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-745f868dcf-njjc6"]
Sep 30 17:36:51 crc kubenswrapper[4778]: I0930 17:36:51.736537 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f31c368-4084-49e9-8d47-c7eb8541b0d2" path="/var/lib/kubelet/pods/2f31c368-4084-49e9-8d47-c7eb8541b0d2/volumes"
Sep 30 17:36:55 crc kubenswrapper[4778]: I0930 17:36:55.068027 4778 generic.go:334] "Generic (PLEG): container finished" podID="7268e8e0-16cf-4b12-8593-e579df617a0e" containerID="f7c488c44759fad25c0dc4cf6e61440cc324771c957d9535029796d90cfbe524" exitCode=0
Sep 30 17:36:55 crc kubenswrapper[4778]: I0930 17:36:55.068130 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mt5l7" event={"ID":"7268e8e0-16cf-4b12-8593-e579df617a0e","Type":"ContainerDied","Data":"f7c488c44759fad25c0dc4cf6e61440cc324771c957d9535029796d90cfbe524"}
Sep 30 17:36:56 crc kubenswrapper[4778]: I0930 17:36:56.341103 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Sep 30 17:36:56 crc kubenswrapper[4778]: I0930 17:36:56.341761 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Sep 30 17:36:56 crc kubenswrapper[4778]: I0930 17:36:56.425279 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mt5l7"
Sep 30 17:36:56 crc kubenswrapper[4778]: I0930 17:36:56.604740 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-config-data\") pod \"7268e8e0-16cf-4b12-8593-e579df617a0e\" (UID: \"7268e8e0-16cf-4b12-8593-e579df617a0e\") "
Sep 30 17:36:56 crc kubenswrapper[4778]: I0930 17:36:56.604884 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6trvs\" (UniqueName: \"kubernetes.io/projected/7268e8e0-16cf-4b12-8593-e579df617a0e-kube-api-access-6trvs\") pod \"7268e8e0-16cf-4b12-8593-e579df617a0e\" (UID: \"7268e8e0-16cf-4b12-8593-e579df617a0e\") "
Sep 30 17:36:56 crc kubenswrapper[4778]: I0930 17:36:56.604918 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-scripts\") pod \"7268e8e0-16cf-4b12-8593-e579df617a0e\" (UID: \"7268e8e0-16cf-4b12-8593-e579df617a0e\") "
Sep 30 17:36:56 crc kubenswrapper[4778]: I0930 17:36:56.604989 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-combined-ca-bundle\") pod \"7268e8e0-16cf-4b12-8593-e579df617a0e\" (UID: \"7268e8e0-16cf-4b12-8593-e579df617a0e\") "
Sep 30 17:36:56 crc kubenswrapper[4778]: I0930 17:36:56.612970 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7268e8e0-16cf-4b12-8593-e579df617a0e-kube-api-access-6trvs" (OuterVolumeSpecName: "kube-api-access-6trvs") pod "7268e8e0-16cf-4b12-8593-e579df617a0e" (UID: "7268e8e0-16cf-4b12-8593-e579df617a0e"). InnerVolumeSpecName "kube-api-access-6trvs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:36:56 crc kubenswrapper[4778]: I0930 17:36:56.621027 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-scripts" (OuterVolumeSpecName: "scripts") pod "7268e8e0-16cf-4b12-8593-e579df617a0e" (UID: "7268e8e0-16cf-4b12-8593-e579df617a0e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:36:56 crc kubenswrapper[4778]: I0930 17:36:56.642631 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7268e8e0-16cf-4b12-8593-e579df617a0e" (UID: "7268e8e0-16cf-4b12-8593-e579df617a0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:36:56 crc kubenswrapper[4778]: I0930 17:36:56.645427 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-config-data" (OuterVolumeSpecName: "config-data") pod "7268e8e0-16cf-4b12-8593-e579df617a0e" (UID: "7268e8e0-16cf-4b12-8593-e579df617a0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:36:56 crc kubenswrapper[4778]: I0930 17:36:56.707850 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 17:36:56 crc kubenswrapper[4778]: I0930 17:36:56.707902 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6trvs\" (UniqueName: \"kubernetes.io/projected/7268e8e0-16cf-4b12-8593-e579df617a0e-kube-api-access-6trvs\") on node \"crc\" DevicePath \"\""
Sep 30 17:36:56 crc kubenswrapper[4778]: I0930 17:36:56.707924 4778 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 17:36:56 crc kubenswrapper[4778]: I0930 17:36:56.707941 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7268e8e0-16cf-4b12-8593-e579df617a0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:36:57 crc kubenswrapper[4778]: I0930 17:36:57.093517 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-mt5l7" event={"ID":"7268e8e0-16cf-4b12-8593-e579df617a0e","Type":"ContainerDied","Data":"1f0e4a4e35a0bf4cf659e530b9a849541d678d553e636bd8c96c34d2bf97ee68"}
Sep 30 17:36:57 crc kubenswrapper[4778]: I0930 17:36:57.094011 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f0e4a4e35a0bf4cf659e530b9a849541d678d553e636bd8c96c34d2bf97ee68"
Sep 30 17:36:57 crc kubenswrapper[4778]: I0930 17:36:57.093611 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-mt5l7"
Sep 30 17:36:57 crc kubenswrapper[4778]: I0930 17:36:57.266986 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Sep 30 17:36:57 crc kubenswrapper[4778]: I0930 17:36:57.267244 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1a6cd22c-2090-4cc4-ac05-12ea37404383" containerName="nova-api-log" containerID="cri-o://f559aa002ba988092e649881bf7cd55402dca4a53e799efe53f1b8acd7e9d7b6" gracePeriod=30
Sep 30 17:36:57 crc kubenswrapper[4778]: I0930 17:36:57.267323 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="1a6cd22c-2090-4cc4-ac05-12ea37404383" containerName="nova-api-api" containerID="cri-o://88891d050159c2339358a233f6b0de5f5824a47b1f78435b4ca2b78a9d24be88" gracePeriod=30
Sep 30 17:36:57 crc kubenswrapper[4778]: I0930 17:36:57.281944 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 17:36:57 crc kubenswrapper[4778]: I0930 17:36:57.282215 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="03376d6a-71aa-410f-9c28-d9beeb68dc6c" containerName="nova-scheduler-scheduler" containerID="cri-o://64ba09d8d10ad0c06e3f27146824acbe8ee4ce2ecee827444a4584782e9849cc" gracePeriod=30
Sep 30 17:36:57 crc kubenswrapper[4778]: I0930 17:36:57.288457 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1a6cd22c-2090-4cc4-ac05-12ea37404383" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.179:8774/\": EOF"
Sep 30 17:36:57 crc kubenswrapper[4778]: I0930 17:36:57.292252 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="1a6cd22c-2090-4cc4-ac05-12ea37404383" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.179:8774/\": EOF"
Sep 30 17:36:57 crc kubenswrapper[4778]: I0930 17:36:57.312232 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 17:36:57 crc kubenswrapper[4778]: I0930 17:36:57.312480 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2072ccae-3380-4294-915d-3e3bba81b1f7" containerName="nova-metadata-log" containerID="cri-o://cd14277033a20fbfac832ab4cc4ca24090783ebf5b1eebcdd9ccc66fb8fcee76" gracePeriod=30
Sep 30 17:36:57 crc kubenswrapper[4778]: I0930 17:36:57.312586 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="2072ccae-3380-4294-915d-3e3bba81b1f7" containerName="nova-metadata-metadata" containerID="cri-o://1149cb5fe492549089b6958ac8ac4f97d3735242da8b537106e86fddf25f38f6" gracePeriod=30
Sep 30 17:36:58 crc kubenswrapper[4778]: E0930 17:36:58.034129 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="64ba09d8d10ad0c06e3f27146824acbe8ee4ce2ecee827444a4584782e9849cc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Sep 30 17:36:58 crc kubenswrapper[4778]: E0930 17:36:58.035949 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="64ba09d8d10ad0c06e3f27146824acbe8ee4ce2ecee827444a4584782e9849cc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Sep 30 17:36:58 crc kubenswrapper[4778]: E0930 17:36:58.037351 4778 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="64ba09d8d10ad0c06e3f27146824acbe8ee4ce2ecee827444a4584782e9849cc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Sep 30 17:36:58 crc kubenswrapper[4778]: E0930 17:36:58.037391 4778 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="03376d6a-71aa-410f-9c28-d9beeb68dc6c" containerName="nova-scheduler-scheduler"
Sep 30 17:36:58 crc kubenswrapper[4778]: I0930 17:36:58.104438 4778 generic.go:334] "Generic (PLEG): container finished" podID="2072ccae-3380-4294-915d-3e3bba81b1f7" containerID="cd14277033a20fbfac832ab4cc4ca24090783ebf5b1eebcdd9ccc66fb8fcee76" exitCode=143
Sep 30 17:36:58 crc kubenswrapper[4778]: I0930 17:36:58.104498 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2072ccae-3380-4294-915d-3e3bba81b1f7","Type":"ContainerDied","Data":"cd14277033a20fbfac832ab4cc4ca24090783ebf5b1eebcdd9ccc66fb8fcee76"}
Sep 30 17:36:58 crc kubenswrapper[4778]: I0930 17:36:58.107016 4778 generic.go:334] "Generic (PLEG): container finished" podID="1a6cd22c-2090-4cc4-ac05-12ea37404383" containerID="f559aa002ba988092e649881bf7cd55402dca4a53e799efe53f1b8acd7e9d7b6" exitCode=143
Sep 30 17:36:58 crc kubenswrapper[4778]: I0930 17:36:58.107051 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a6cd22c-2090-4cc4-ac05-12ea37404383","Type":"ContainerDied","Data":"f559aa002ba988092e649881bf7cd55402dca4a53e799efe53f1b8acd7e9d7b6"}
Sep 30 17:37:00 crc kubenswrapper[4778]: E0930 17:37:00.594739 4778 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2072ccae_3380_4294_915d_3e3bba81b1f7.slice/crio-1149cb5fe492549089b6958ac8ac4f97d3735242da8b537106e86fddf25f38f6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2072ccae_3380_4294_915d_3e3bba81b1f7.slice/crio-conmon-1149cb5fe492549089b6958ac8ac4f97d3735242da8b537106e86fddf25f38f6.scope\": RecentStats: unable to find data in memory cache]"
Sep 30 17:37:00 crc kubenswrapper[4778]: I0930 17:37:00.856431 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 17:37:00 crc kubenswrapper[4778]: I0930 17:37:00.989356 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-combined-ca-bundle\") pod \"2072ccae-3380-4294-915d-3e3bba81b1f7\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") "
Sep 30 17:37:00 crc kubenswrapper[4778]: I0930 17:37:00.989713 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2072ccae-3380-4294-915d-3e3bba81b1f7-logs\") pod \"2072ccae-3380-4294-915d-3e3bba81b1f7\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") "
Sep 30 17:37:00 crc kubenswrapper[4778]: I0930 17:37:00.989764 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqdjk\" (UniqueName: \"kubernetes.io/projected/2072ccae-3380-4294-915d-3e3bba81b1f7-kube-api-access-cqdjk\") pod \"2072ccae-3380-4294-915d-3e3bba81b1f7\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") "
Sep 30 17:37:00 crc kubenswrapper[4778]: I0930 17:37:00.989786 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-nova-metadata-tls-certs\") pod \"2072ccae-3380-4294-915d-3e3bba81b1f7\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") "
Sep 30 17:37:00 crc kubenswrapper[4778]: I0930 17:37:00.989840 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-config-data\") pod \"2072ccae-3380-4294-915d-3e3bba81b1f7\" (UID: \"2072ccae-3380-4294-915d-3e3bba81b1f7\") "
Sep 30 17:37:00 crc kubenswrapper[4778]: I0930 17:37:00.991874 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2072ccae-3380-4294-915d-3e3bba81b1f7-logs" (OuterVolumeSpecName: "logs") pod "2072ccae-3380-4294-915d-3e3bba81b1f7" (UID: "2072ccae-3380-4294-915d-3e3bba81b1f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:37:00 crc kubenswrapper[4778]: I0930 17:37:00.995356 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2072ccae-3380-4294-915d-3e3bba81b1f7-kube-api-access-cqdjk" (OuterVolumeSpecName: "kube-api-access-cqdjk") pod "2072ccae-3380-4294-915d-3e3bba81b1f7" (UID: "2072ccae-3380-4294-915d-3e3bba81b1f7"). InnerVolumeSpecName "kube-api-access-cqdjk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.023935 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2072ccae-3380-4294-915d-3e3bba81b1f7" (UID: "2072ccae-3380-4294-915d-3e3bba81b1f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.024932 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-config-data" (OuterVolumeSpecName: "config-data") pod "2072ccae-3380-4294-915d-3e3bba81b1f7" (UID: "2072ccae-3380-4294-915d-3e3bba81b1f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.048712 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "2072ccae-3380-4294-915d-3e3bba81b1f7" (UID: "2072ccae-3380-4294-915d-3e3bba81b1f7"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.092209 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2072ccae-3380-4294-915d-3e3bba81b1f7-logs\") on node \"crc\" DevicePath \"\""
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.092235 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqdjk\" (UniqueName: \"kubernetes.io/projected/2072ccae-3380-4294-915d-3e3bba81b1f7-kube-api-access-cqdjk\") on node \"crc\" DevicePath \"\""
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.092246 4778 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.092256 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.092265 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2072ccae-3380-4294-915d-3e3bba81b1f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.138734 4778 generic.go:334] "Generic (PLEG): container finished" podID="2072ccae-3380-4294-915d-3e3bba81b1f7" containerID="1149cb5fe492549089b6958ac8ac4f97d3735242da8b537106e86fddf25f38f6" exitCode=0
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.138795 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2072ccae-3380-4294-915d-3e3bba81b1f7","Type":"ContainerDied","Data":"1149cb5fe492549089b6958ac8ac4f97d3735242da8b537106e86fddf25f38f6"}
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.138812 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.138842 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"2072ccae-3380-4294-915d-3e3bba81b1f7","Type":"ContainerDied","Data":"7c96daf149ccc2a479e9e85cebcfde7117216eddc8150a8888a6273a4f133402"}
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.138872 4778 scope.go:117] "RemoveContainer" containerID="1149cb5fe492549089b6958ac8ac4f97d3735242da8b537106e86fddf25f38f6"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.182251 4778 scope.go:117] "RemoveContainer" containerID="cd14277033a20fbfac832ab4cc4ca24090783ebf5b1eebcdd9ccc66fb8fcee76"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.190086 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.201928 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.208006 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 17:37:01 crc kubenswrapper[4778]: E0930 17:37:01.208436 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f31c368-4084-49e9-8d47-c7eb8541b0d2" containerName="init"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.208459 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f31c368-4084-49e9-8d47-c7eb8541b0d2" containerName="init"
Sep 30 17:37:01 crc kubenswrapper[4778]: E0930 17:37:01.208473 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2072ccae-3380-4294-915d-3e3bba81b1f7" containerName="nova-metadata-metadata"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.208483 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2072ccae-3380-4294-915d-3e3bba81b1f7" containerName="nova-metadata-metadata"
Sep 30 17:37:01 crc kubenswrapper[4778]: E0930 17:37:01.208497 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2072ccae-3380-4294-915d-3e3bba81b1f7" containerName="nova-metadata-log"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.208505 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2072ccae-3380-4294-915d-3e3bba81b1f7" containerName="nova-metadata-log"
Sep 30 17:37:01 crc kubenswrapper[4778]: E0930 17:37:01.208525 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f31c368-4084-49e9-8d47-c7eb8541b0d2" containerName="dnsmasq-dns"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.208533 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f31c368-4084-49e9-8d47-c7eb8541b0d2" containerName="dnsmasq-dns"
Sep 30 17:37:01 crc kubenswrapper[4778]: E0930 17:37:01.208560 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7268e8e0-16cf-4b12-8593-e579df617a0e" containerName="nova-manage"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.208568 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="7268e8e0-16cf-4b12-8593-e579df617a0e" containerName="nova-manage"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.208788 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2072ccae-3380-4294-915d-3e3bba81b1f7" containerName="nova-metadata-log"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.208803 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2072ccae-3380-4294-915d-3e3bba81b1f7" containerName="nova-metadata-metadata"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.208816 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="7268e8e0-16cf-4b12-8593-e579df617a0e" containerName="nova-manage"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.208846 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f31c368-4084-49e9-8d47-c7eb8541b0d2" containerName="dnsmasq-dns"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.210186 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.212330 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.213153 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.216771 4778 scope.go:117] "RemoveContainer" containerID="1149cb5fe492549089b6958ac8ac4f97d3735242da8b537106e86fddf25f38f6"
Sep 30 17:37:01 crc kubenswrapper[4778]: E0930 17:37:01.217169 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1149cb5fe492549089b6958ac8ac4f97d3735242da8b537106e86fddf25f38f6\": container with ID starting with 1149cb5fe492549089b6958ac8ac4f97d3735242da8b537106e86fddf25f38f6 not found: ID does not exist" containerID="1149cb5fe492549089b6958ac8ac4f97d3735242da8b537106e86fddf25f38f6"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.217211 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1149cb5fe492549089b6958ac8ac4f97d3735242da8b537106e86fddf25f38f6"} err="failed to get container status \"1149cb5fe492549089b6958ac8ac4f97d3735242da8b537106e86fddf25f38f6\": rpc error: code = NotFound desc = could not find container \"1149cb5fe492549089b6958ac8ac4f97d3735242da8b537106e86fddf25f38f6\": container with ID starting with 1149cb5fe492549089b6958ac8ac4f97d3735242da8b537106e86fddf25f38f6 not found: ID does not exist"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.217239 4778 scope.go:117] "RemoveContainer" containerID="cd14277033a20fbfac832ab4cc4ca24090783ebf5b1eebcdd9ccc66fb8fcee76"
Sep 30 17:37:01 crc kubenswrapper[4778]: E0930 17:37:01.217509 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd14277033a20fbfac832ab4cc4ca24090783ebf5b1eebcdd9ccc66fb8fcee76\": container with ID starting with cd14277033a20fbfac832ab4cc4ca24090783ebf5b1eebcdd9ccc66fb8fcee76 not found: ID does not exist" containerID="cd14277033a20fbfac832ab4cc4ca24090783ebf5b1eebcdd9ccc66fb8fcee76"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.217599 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd14277033a20fbfac832ab4cc4ca24090783ebf5b1eebcdd9ccc66fb8fcee76"} err="failed to get container status \"cd14277033a20fbfac832ab4cc4ca24090783ebf5b1eebcdd9ccc66fb8fcee76\": rpc error: code = NotFound desc = could not find container \"cd14277033a20fbfac832ab4cc4ca24090783ebf5b1eebcdd9ccc66fb8fcee76\": container with ID starting with cd14277033a20fbfac832ab4cc4ca24090783ebf5b1eebcdd9ccc66fb8fcee76 not found: ID does not exist"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.221456 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.396904 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f29fe8f-af06-4ffb-b611-af0bb9c5cebb-logs\") pod \"nova-metadata-0\" (UID: \"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb\") " pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.396966 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f29fe8f-af06-4ffb-b611-af0bb9c5cebb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb\") " pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.397013 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvp5n\" (UniqueName: \"kubernetes.io/projected/4f29fe8f-af06-4ffb-b611-af0bb9c5cebb-kube-api-access-hvp5n\") pod \"nova-metadata-0\" (UID: \"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb\") " pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.397273 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f29fe8f-af06-4ffb-b611-af0bb9c5cebb-config-data\") pod \"nova-metadata-0\" (UID: \"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb\") " pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.397523 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f29fe8f-af06-4ffb-b611-af0bb9c5cebb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb\") " pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.499888 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f29fe8f-af06-4ffb-b611-af0bb9c5cebb-config-data\") pod \"nova-metadata-0\" (UID: \"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb\") " pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.500019 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f29fe8f-af06-4ffb-b611-af0bb9c5cebb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb\") " pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.500096 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f29fe8f-af06-4ffb-b611-af0bb9c5cebb-logs\") pod \"nova-metadata-0\" (UID: \"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb\") " pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.500137 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f29fe8f-af06-4ffb-b611-af0bb9c5cebb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb\") " pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.500176 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvp5n\" (UniqueName: \"kubernetes.io/projected/4f29fe8f-af06-4ffb-b611-af0bb9c5cebb-kube-api-access-hvp5n\") pod \"nova-metadata-0\" (UID: \"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb\") " pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.501719 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f29fe8f-af06-4ffb-b611-af0bb9c5cebb-logs\") pod \"nova-metadata-0\" (UID: \"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb\") " pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.505921 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f29fe8f-af06-4ffb-b611-af0bb9c5cebb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb\") " pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.506138 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f29fe8f-af06-4ffb-b611-af0bb9c5cebb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb\") " pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.507743 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f29fe8f-af06-4ffb-b611-af0bb9c5cebb-config-data\") pod \"nova-metadata-0\" (UID: \"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb\") " pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.520005 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvp5n\" (UniqueName: \"kubernetes.io/projected/4f29fe8f-af06-4ffb-b611-af0bb9c5cebb-kube-api-access-hvp5n\") pod \"nova-metadata-0\" (UID: \"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb\") " pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.545469 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.723981 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2072ccae-3380-4294-915d-3e3bba81b1f7" path="/var/lib/kubelet/pods/2072ccae-3380-4294-915d-3e3bba81b1f7/volumes"
Sep 30 17:37:01 crc kubenswrapper[4778]: I0930 17:37:01.899203 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.011452 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6gqr\" (UniqueName: \"kubernetes.io/projected/03376d6a-71aa-410f-9c28-d9beeb68dc6c-kube-api-access-c6gqr\") pod \"03376d6a-71aa-410f-9c28-d9beeb68dc6c\" (UID: \"03376d6a-71aa-410f-9c28-d9beeb68dc6c\") "
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.011589 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03376d6a-71aa-410f-9c28-d9beeb68dc6c-config-data\") pod \"03376d6a-71aa-410f-9c28-d9beeb68dc6c\" (UID: \"03376d6a-71aa-410f-9c28-d9beeb68dc6c\") "
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.011677 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03376d6a-71aa-410f-9c28-d9beeb68dc6c-combined-ca-bundle\") pod \"03376d6a-71aa-410f-9c28-d9beeb68dc6c\" (UID: \"03376d6a-71aa-410f-9c28-d9beeb68dc6c\") "
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.023103 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03376d6a-71aa-410f-9c28-d9beeb68dc6c-kube-api-access-c6gqr" (OuterVolumeSpecName: "kube-api-access-c6gqr") pod "03376d6a-71aa-410f-9c28-d9beeb68dc6c" (UID: "03376d6a-71aa-410f-9c28-d9beeb68dc6c"). InnerVolumeSpecName "kube-api-access-c6gqr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.051151 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.057888 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03376d6a-71aa-410f-9c28-d9beeb68dc6c-config-data" (OuterVolumeSpecName: "config-data") pod "03376d6a-71aa-410f-9c28-d9beeb68dc6c" (UID: "03376d6a-71aa-410f-9c28-d9beeb68dc6c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.058012 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03376d6a-71aa-410f-9c28-d9beeb68dc6c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03376d6a-71aa-410f-9c28-d9beeb68dc6c" (UID: "03376d6a-71aa-410f-9c28-d9beeb68dc6c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.113955 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6gqr\" (UniqueName: \"kubernetes.io/projected/03376d6a-71aa-410f-9c28-d9beeb68dc6c-kube-api-access-c6gqr\") on node \"crc\" DevicePath \"\""
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.113989 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03376d6a-71aa-410f-9c28-d9beeb68dc6c-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.114001 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03376d6a-71aa-410f-9c28-d9beeb68dc6c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.147922 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb","Type":"ContainerStarted","Data":"e1d64b8e7d3c41fbf3fce57c93e48e87fc4ef27b2a28c008d06ba740dc2af2ae"}
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.151191 4778 generic.go:334] "Generic (PLEG): container finished" podID="03376d6a-71aa-410f-9c28-d9beeb68dc6c" containerID="64ba09d8d10ad0c06e3f27146824acbe8ee4ce2ecee827444a4584782e9849cc" exitCode=0
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.151233 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"03376d6a-71aa-410f-9c28-d9beeb68dc6c","Type":"ContainerDied","Data":"64ba09d8d10ad0c06e3f27146824acbe8ee4ce2ecee827444a4584782e9849cc"}
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.151259 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"03376d6a-71aa-410f-9c28-d9beeb68dc6c","Type":"ContainerDied","Data":"5b9b5baa43025c14e112e7bbbba33b7bbc9c42f91c709108de639d9313d1c22a"}
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.151274 4778 scope.go:117] "RemoveContainer" containerID="64ba09d8d10ad0c06e3f27146824acbe8ee4ce2ecee827444a4584782e9849cc"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.151404 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.171369 4778 scope.go:117] "RemoveContainer" containerID="64ba09d8d10ad0c06e3f27146824acbe8ee4ce2ecee827444a4584782e9849cc"
Sep 30 17:37:02 crc kubenswrapper[4778]: E0930 17:37:02.171914 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64ba09d8d10ad0c06e3f27146824acbe8ee4ce2ecee827444a4584782e9849cc\": container with ID starting with 64ba09d8d10ad0c06e3f27146824acbe8ee4ce2ecee827444a4584782e9849cc not found: ID does not exist" containerID="64ba09d8d10ad0c06e3f27146824acbe8ee4ce2ecee827444a4584782e9849cc"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.171955 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64ba09d8d10ad0c06e3f27146824acbe8ee4ce2ecee827444a4584782e9849cc"} err="failed to get container status \"64ba09d8d10ad0c06e3f27146824acbe8ee4ce2ecee827444a4584782e9849cc\": rpc error: code = NotFound desc = could not find container \"64ba09d8d10ad0c06e3f27146824acbe8ee4ce2ecee827444a4584782e9849cc\": container with ID starting with 64ba09d8d10ad0c06e3f27146824acbe8ee4ce2ecee827444a4584782e9849cc not found: ID does not exist"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.187163 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.198214 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.213217 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 17:37:02 crc kubenswrapper[4778]: E0930 17:37:02.213633 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03376d6a-71aa-410f-9c28-d9beeb68dc6c" containerName="nova-scheduler-scheduler"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.213652 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="03376d6a-71aa-410f-9c28-d9beeb68dc6c" containerName="nova-scheduler-scheduler"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.213843 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="03376d6a-71aa-410f-9c28-d9beeb68dc6c" containerName="nova-scheduler-scheduler"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.214551 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.220256 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.222582 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.319522 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab3e484-85ba-4428-9020-04c11efe96aa-config-data\") pod \"nova-scheduler-0\" (UID: \"dab3e484-85ba-4428-9020-04c11efe96aa\") " pod="openstack/nova-scheduler-0"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.319745 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab3e484-85ba-4428-9020-04c11efe96aa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dab3e484-85ba-4428-9020-04c11efe96aa\") " pod="openstack/nova-scheduler-0"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.319771 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l6z2\" (UniqueName: \"kubernetes.io/projected/dab3e484-85ba-4428-9020-04c11efe96aa-kube-api-access-4l6z2\") pod \"nova-scheduler-0\" (UID: \"dab3e484-85ba-4428-9020-04c11efe96aa\") " pod="openstack/nova-scheduler-0"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.422529 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab3e484-85ba-4428-9020-04c11efe96aa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dab3e484-85ba-4428-9020-04c11efe96aa\") " pod="openstack/nova-scheduler-0"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.422569 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l6z2\" (UniqueName: \"kubernetes.io/projected/dab3e484-85ba-4428-9020-04c11efe96aa-kube-api-access-4l6z2\") pod \"nova-scheduler-0\" (UID: \"dab3e484-85ba-4428-9020-04c11efe96aa\") " pod="openstack/nova-scheduler-0"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.422653 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab3e484-85ba-4428-9020-04c11efe96aa-config-data\") pod \"nova-scheduler-0\" (UID: \"dab3e484-85ba-4428-9020-04c11efe96aa\") " pod="openstack/nova-scheduler-0"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.426975 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dab3e484-85ba-4428-9020-04c11efe96aa-config-data\") pod \"nova-scheduler-0\" (UID: \"dab3e484-85ba-4428-9020-04c11efe96aa\") " pod="openstack/nova-scheduler-0"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.427216 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dab3e484-85ba-4428-9020-04c11efe96aa-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dab3e484-85ba-4428-9020-04c11efe96aa\") " pod="openstack/nova-scheduler-0"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.442414 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l6z2\" (UniqueName: \"kubernetes.io/projected/dab3e484-85ba-4428-9020-04c11efe96aa-kube-api-access-4l6z2\") pod \"nova-scheduler-0\" (UID: \"dab3e484-85ba-4428-9020-04c11efe96aa\") " pod="openstack/nova-scheduler-0"
Sep 30 17:37:02 crc kubenswrapper[4778]: I0930 17:37:02.543999 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.026976 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Sep 30 17:37:03 crc kubenswrapper[4778]: W0930 17:37:03.035137 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddab3e484_85ba_4428_9020_04c11efe96aa.slice/crio-b825643f94f80bffecdb8780354559a21fba52782d8349037b9d2206f35b393f WatchSource:0}: Error finding container b825643f94f80bffecdb8780354559a21fba52782d8349037b9d2206f35b393f: Status 404 returned error can't find the container with id b825643f94f80bffecdb8780354559a21fba52782d8349037b9d2206f35b393f
Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.163803 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dab3e484-85ba-4428-9020-04c11efe96aa","Type":"ContainerStarted","Data":"b825643f94f80bffecdb8780354559a21fba52782d8349037b9d2206f35b393f"}
Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.165741 4778 generic.go:334] "Generic (PLEG): container finished" podID="1a6cd22c-2090-4cc4-ac05-12ea37404383" containerID="88891d050159c2339358a233f6b0de5f5824a47b1f78435b4ca2b78a9d24be88" exitCode=0
Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.165851 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a6cd22c-2090-4cc4-ac05-12ea37404383","Type":"ContainerDied","Data":"88891d050159c2339358a233f6b0de5f5824a47b1f78435b4ca2b78a9d24be88"}
Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.165911 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"1a6cd22c-2090-4cc4-ac05-12ea37404383","Type":"ContainerDied","Data":"67d69112cd59b359009aa7ecb5204361da2e4bdb9d09e0d47ce3a30050720261"}
Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.165926 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67d69112cd59b359009aa7ecb5204361da2e4bdb9d09e0d47ce3a30050720261"
Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.168446 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb","Type":"ContainerStarted","Data":"a393fbbcdbdf1764fc5258c1d0daea2563081db9d8ab9c593b59d32b399d69e3"}
Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.168474 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f29fe8f-af06-4ffb-b611-af0bb9c5cebb","Type":"ContainerStarted","Data":"3ed53d0408ba4b6dff5359dac80cd7655d9de64abf6e580aad1f07ffd31f10a2"}
Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.192317 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.194567 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.194552552 podStartE2EDuration="2.194552552s" podCreationTimestamp="2025-09-30 17:37:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:37:03.192664063 +0000 UTC m=+1162.182561886" watchObservedRunningTime="2025-09-30 17:37:03.194552552 +0000 UTC m=+1162.184450355" Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.340145 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdqsb\" (UniqueName: \"kubernetes.io/projected/1a6cd22c-2090-4cc4-ac05-12ea37404383-kube-api-access-qdqsb\") pod \"1a6cd22c-2090-4cc4-ac05-12ea37404383\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.340499 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-config-data\") pod \"1a6cd22c-2090-4cc4-ac05-12ea37404383\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.341206 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-public-tls-certs\") pod \"1a6cd22c-2090-4cc4-ac05-12ea37404383\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.341393 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a6cd22c-2090-4cc4-ac05-12ea37404383-logs\") pod \"1a6cd22c-2090-4cc4-ac05-12ea37404383\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.341475 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-internal-tls-certs\") pod \"1a6cd22c-2090-4cc4-ac05-12ea37404383\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.341545 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-combined-ca-bundle\") pod \"1a6cd22c-2090-4cc4-ac05-12ea37404383\" (UID: \"1a6cd22c-2090-4cc4-ac05-12ea37404383\") " Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.341880 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a6cd22c-2090-4cc4-ac05-12ea37404383-logs" (OuterVolumeSpecName: "logs") pod "1a6cd22c-2090-4cc4-ac05-12ea37404383" (UID: "1a6cd22c-2090-4cc4-ac05-12ea37404383"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.342125 4778 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a6cd22c-2090-4cc4-ac05-12ea37404383-logs\") on node \"crc\" DevicePath \"\"" Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.345698 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a6cd22c-2090-4cc4-ac05-12ea37404383-kube-api-access-qdqsb" (OuterVolumeSpecName: "kube-api-access-qdqsb") pod "1a6cd22c-2090-4cc4-ac05-12ea37404383" (UID: "1a6cd22c-2090-4cc4-ac05-12ea37404383"). InnerVolumeSpecName "kube-api-access-qdqsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.369553 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a6cd22c-2090-4cc4-ac05-12ea37404383" (UID: "1a6cd22c-2090-4cc4-ac05-12ea37404383"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.374903 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-config-data" (OuterVolumeSpecName: "config-data") pod "1a6cd22c-2090-4cc4-ac05-12ea37404383" (UID: "1a6cd22c-2090-4cc4-ac05-12ea37404383"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.390256 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1a6cd22c-2090-4cc4-ac05-12ea37404383" (UID: "1a6cd22c-2090-4cc4-ac05-12ea37404383"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.393079 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1a6cd22c-2090-4cc4-ac05-12ea37404383" (UID: "1a6cd22c-2090-4cc4-ac05-12ea37404383"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.443368 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.443398 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdqsb\" (UniqueName: \"kubernetes.io/projected/1a6cd22c-2090-4cc4-ac05-12ea37404383-kube-api-access-qdqsb\") on node \"crc\" DevicePath \"\"" Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.443410 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.443420 4778 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.443428 4778 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a6cd22c-2090-4cc4-ac05-12ea37404383-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 17:37:03 crc kubenswrapper[4778]: I0930 17:37:03.727188 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03376d6a-71aa-410f-9c28-d9beeb68dc6c" path="/var/lib/kubelet/pods/03376d6a-71aa-410f-9c28-d9beeb68dc6c/volumes" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.181212 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dab3e484-85ba-4428-9020-04c11efe96aa","Type":"ContainerStarted","Data":"c17a130cfd14c1f6317a5841b8e4c0f4df5bab605780c711115ec525b4fc9dfc"} Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.181470 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.204848 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.204826075 podStartE2EDuration="2.204826075s" podCreationTimestamp="2025-09-30 17:37:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:37:04.202673137 +0000 UTC m=+1163.192570940" watchObservedRunningTime="2025-09-30 17:37:04.204826075 +0000 UTC m=+1163.194723878" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.242017 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.261751 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.272332 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 17:37:04 crc kubenswrapper[4778]: E0930 17:37:04.272967 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a6cd22c-2090-4cc4-ac05-12ea37404383" containerName="nova-api-log" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.272988 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6cd22c-2090-4cc4-ac05-12ea37404383" containerName="nova-api-log" Sep 30 17:37:04 crc kubenswrapper[4778]: E0930 17:37:04.273019 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a6cd22c-2090-4cc4-ac05-12ea37404383" containerName="nova-api-api" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.273034 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6cd22c-2090-4cc4-ac05-12ea37404383" containerName="nova-api-api" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.273361 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a6cd22c-2090-4cc4-ac05-12ea37404383" containerName="nova-api-log" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.273396 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a6cd22c-2090-4cc4-ac05-12ea37404383" containerName="nova-api-api" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.275007 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.278545 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.279004 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.279346 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.294840 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.361179 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27d98d14-63d0-446f-8dcc-db21c137feb5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.361243 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27d98d14-63d0-446f-8dcc-db21c137feb5-logs\") pod \"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.361277 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27d98d14-63d0-446f-8dcc-db21c137feb5-config-data\") pod \"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.361363 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9jzr\" (UniqueName: \"kubernetes.io/projected/27d98d14-63d0-446f-8dcc-db21c137feb5-kube-api-access-t9jzr\") pod \"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.361400 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27d98d14-63d0-446f-8dcc-db21c137feb5-public-tls-certs\") pod \"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.361457 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27d98d14-63d0-446f-8dcc-db21c137feb5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.463358 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9jzr\" (UniqueName: \"kubernetes.io/projected/27d98d14-63d0-446f-8dcc-db21c137feb5-kube-api-access-t9jzr\") pod \"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.463439 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27d98d14-63d0-446f-8dcc-db21c137feb5-public-tls-certs\") pod 
\"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.463499 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27d98d14-63d0-446f-8dcc-db21c137feb5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.463564 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27d98d14-63d0-446f-8dcc-db21c137feb5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.463596 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27d98d14-63d0-446f-8dcc-db21c137feb5-logs\") pod \"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.463641 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27d98d14-63d0-446f-8dcc-db21c137feb5-config-data\") pod \"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.465881 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/27d98d14-63d0-446f-8dcc-db21c137feb5-logs\") pod \"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.469594 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27d98d14-63d0-446f-8dcc-db21c137feb5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.469941 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/27d98d14-63d0-446f-8dcc-db21c137feb5-public-tls-certs\") pod \"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.470499 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27d98d14-63d0-446f-8dcc-db21c137feb5-config-data\") pod \"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.470948 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/27d98d14-63d0-446f-8dcc-db21c137feb5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.491825 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9jzr\" (UniqueName: \"kubernetes.io/projected/27d98d14-63d0-446f-8dcc-db21c137feb5-kube-api-access-t9jzr\") pod \"nova-api-0\" (UID: \"27d98d14-63d0-446f-8dcc-db21c137feb5\") " pod="openstack/nova-api-0" Sep 
30 17:37:04 crc kubenswrapper[4778]: I0930 17:37:04.596987 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 17:37:05 crc kubenswrapper[4778]: I0930 17:37:05.078508 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 17:37:05 crc kubenswrapper[4778]: W0930 17:37:05.086084 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27d98d14_63d0_446f_8dcc_db21c137feb5.slice/crio-92e9e098b5f0fc650b3b937cfef41d917a0364223597b6e2c67b56b2f78915a3 WatchSource:0}: Error finding container 92e9e098b5f0fc650b3b937cfef41d917a0364223597b6e2c67b56b2f78915a3: Status 404 returned error can't find the container with id 92e9e098b5f0fc650b3b937cfef41d917a0364223597b6e2c67b56b2f78915a3 Sep 30 17:37:05 crc kubenswrapper[4778]: I0930 17:37:05.193066 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27d98d14-63d0-446f-8dcc-db21c137feb5","Type":"ContainerStarted","Data":"92e9e098b5f0fc650b3b937cfef41d917a0364223597b6e2c67b56b2f78915a3"} Sep 30 17:37:05 crc kubenswrapper[4778]: I0930 17:37:05.729090 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a6cd22c-2090-4cc4-ac05-12ea37404383" path="/var/lib/kubelet/pods/1a6cd22c-2090-4cc4-ac05-12ea37404383/volumes" Sep 30 17:37:06 crc kubenswrapper[4778]: I0930 17:37:06.211538 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27d98d14-63d0-446f-8dcc-db21c137feb5","Type":"ContainerStarted","Data":"e9dccb2555208ec7b86c1f822d364b54b70b9128abf9cbfa2abcadc8a3f3a685"} Sep 30 17:37:06 crc kubenswrapper[4778]: I0930 17:37:06.211784 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"27d98d14-63d0-446f-8dcc-db21c137feb5","Type":"ContainerStarted","Data":"a6d2482bd254992c5f5fc8a65f16650c39ae86f17bd485caa1541234879a1aca"} Sep 30 17:37:06 crc kubenswrapper[4778]: I0930 17:37:06.252246 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.252209058 podStartE2EDuration="2.252209058s" podCreationTimestamp="2025-09-30 17:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 17:37:06.236846933 +0000 UTC m=+1165.226744726" watchObservedRunningTime="2025-09-30 17:37:06.252209058 +0000 UTC m=+1165.242106861" Sep 30 17:37:06 crc kubenswrapper[4778]: I0930 17:37:06.545815 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 17:37:06 crc kubenswrapper[4778]: I0930 17:37:06.545956 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 17:37:07 crc kubenswrapper[4778]: I0930 17:37:07.544743 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 17:37:11 crc kubenswrapper[4778]: I0930 17:37:11.546072 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 17:37:11 crc kubenswrapper[4778]: I0930 17:37:11.546659 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 17:37:12 crc kubenswrapper[4778]: I0930 17:37:12.545112 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 
30 17:37:12 crc kubenswrapper[4778]: I0930 17:37:12.557782 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4f29fe8f-af06-4ffb-b611-af0bb9c5cebb" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.181:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Sep 30 17:37:12 crc kubenswrapper[4778]: I0930 17:37:12.557907 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4f29fe8f-af06-4ffb-b611-af0bb9c5cebb" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.181:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Sep 30 17:37:12 crc kubenswrapper[4778]: I0930 17:37:12.576580 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Sep 30 17:37:13 crc kubenswrapper[4778]: I0930 17:37:13.307906 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Sep 30 17:37:14 crc kubenswrapper[4778]: I0930 17:37:14.597337 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Sep 30 17:37:14 crc kubenswrapper[4778]: I0930 17:37:14.597432 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Sep 30 17:37:15 crc kubenswrapper[4778]: I0930 17:37:15.615927 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="27d98d14-63d0-446f-8dcc-db21c137feb5" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.183:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Sep 30 17:37:15 crc kubenswrapper[4778]: I0930 17:37:15.615971 4778 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="27d98d14-63d0-446f-8dcc-db21c137feb5" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.183:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Sep 30 17:37:21 crc kubenswrapper[4778]: I0930 17:37:21.554441 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Sep 30 17:37:21 crc kubenswrapper[4778]: I0930 17:37:21.556266 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Sep 30 17:37:21 crc kubenswrapper[4778]: I0930 17:37:21.564338 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Sep 30 17:37:22 crc kubenswrapper[4778]: I0930 17:37:22.389449 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Sep 30 17:37:24 crc kubenswrapper[4778]: I0930 17:37:24.607610 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Sep 30 17:37:24 crc kubenswrapper[4778]: I0930 17:37:24.608146 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Sep 30 17:37:24 crc kubenswrapper[4778]: I0930 17:37:24.608600 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Sep 30 17:37:24 crc kubenswrapper[4778]: I0930 17:37:24.608700 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
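
The probe failures above come from HTTPS GET startup probes that time out while nova-api and nova-metadata finish initializing; once a probe request succeeds the status flips to "started" and the readiness probe takes over. A sketch of a startup probe shaped like the one the nova-metadata lines imply: the scheme, port, and path are taken from the probe URL in the log, while the numeric thresholds are assumptions for illustration.

```go
package main

import (
	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

// novaMetadataStartupProbe mirrors the probe the log suggests:
// GET https://<pod-ip>:8775/ until the service starts answering.
func novaMetadataStartupProbe() *corev1.Probe {
	return &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Path:   "/",
				Port:   intstr.FromInt(8775),
				Scheme: corev1.URISchemeHTTPS,
			},
		},
		TimeoutSeconds:   5,  // assumed; the log shows Client.Timeout firing
		PeriodSeconds:    10, // assumed; failures recur until ~17:37:21
		FailureThreshold: 6,  // assumed; gives the service time to warm up
	}
}

func main() { _ = novaMetadataStartupProbe() }
```
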
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 17:37:24 crc kubenswrapper[4778]: I0930 17:37:24.620717 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 17:37:44 crc kubenswrapper[4778]: I0930 17:37:44.812548 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:37:44 crc kubenswrapper[4778]: I0930 17:37:44.813489 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:38:14 crc kubenswrapper[4778]: I0930 17:38:14.811803 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:38:14 crc kubenswrapper[4778]: I0930 17:38:14.812446 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:38:44 crc kubenswrapper[4778]: I0930 17:38:44.812337 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:38:44 crc kubenswrapper[4778]: I0930 17:38:44.813014 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:38:44 crc kubenswrapper[4778]: I0930 17:38:44.813079 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:38:44 crc kubenswrapper[4778]: I0930 17:38:44.814023 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"326830dff025b3a623c0b61fcab0be0c2c9f34db066bfe20235b1b092a5d8935"} pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:38:44 crc kubenswrapper[4778]: I0930 17:38:44.814114 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" containerID="cri-o://326830dff025b3a623c0b61fcab0be0c2c9f34db066bfe20235b1b092a5d8935" gracePeriod=600 Sep 30 17:38:45 crc 
Sep 30 17:38:45 crc kubenswrapper[4778]: I0930 17:38:45.180632 4778 generic.go:334] "Generic (PLEG): container finished" podID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerID="326830dff025b3a623c0b61fcab0be0c2c9f34db066bfe20235b1b092a5d8935" exitCode=0
Sep 30 17:38:45 crc kubenswrapper[4778]: I0930 17:38:45.180649 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerDied","Data":"326830dff025b3a623c0b61fcab0be0c2c9f34db066bfe20235b1b092a5d8935"}
Sep 30 17:38:45 crc kubenswrapper[4778]: I0930 17:38:45.181015 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerStarted","Data":"9bb2b540381c42fbc380e8cd8f4bc8733104d08b1605c2c2155245fff3463fd6"}
Sep 30 17:38:45 crc kubenswrapper[4778]: I0930 17:38:45.181048 4778 scope.go:117] "RemoveContainer" containerID="97044f3dbaff452261d88827459c4da3b10678dd945a7792a4a92fff2dc6be50"
Sep 30 17:39:45 crc kubenswrapper[4778]: I0930 17:39:45.536650 4778 scope.go:117] "RemoveContainer" containerID="b7b9816c46f54a3b0a286e5f7c0f0f356e4366bd434c026da3a48a1b37314b27"
Sep 30 17:40:45 crc kubenswrapper[4778]: I0930 17:40:45.611130 4778 scope.go:117] "RemoveContainer" containerID="fc1237327068b1110e5968051edbe956cc9e3480ea6c35718ae2d5ebf2c0a89d"
Sep 30 17:40:45 crc kubenswrapper[4778]: I0930 17:40:45.646529 4778 scope.go:117] "RemoveContainer" containerID="d7fa2809b29ef0fb7beb23c0ca2cc0508bca1d95a64f78a90519afaf4e3c1a3b"
Sep 30 17:41:14 crc kubenswrapper[4778]: I0930 17:41:14.563436 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-56khl"]
Sep 30 17:41:14 crc kubenswrapper[4778]: I0930 17:41:14.566172 4778 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-56khl" Sep 30 17:41:14 crc kubenswrapper[4778]: I0930 17:41:14.597797 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-56khl"] Sep 30 17:41:14 crc kubenswrapper[4778]: I0930 17:41:14.700313 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frxzs\" (UniqueName: \"kubernetes.io/projected/be9b8b25-b79e-4236-836d-f04b4c570782-kube-api-access-frxzs\") pod \"redhat-marketplace-56khl\" (UID: \"be9b8b25-b79e-4236-836d-f04b4c570782\") " pod="openshift-marketplace/redhat-marketplace-56khl" Sep 30 17:41:14 crc kubenswrapper[4778]: I0930 17:41:14.700365 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9b8b25-b79e-4236-836d-f04b4c570782-utilities\") pod \"redhat-marketplace-56khl\" (UID: \"be9b8b25-b79e-4236-836d-f04b4c570782\") " pod="openshift-marketplace/redhat-marketplace-56khl" Sep 30 17:41:14 crc kubenswrapper[4778]: I0930 17:41:14.700418 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be9b8b25-b79e-4236-836d-f04b4c570782-catalog-content\") pod \"redhat-marketplace-56khl\" (UID: \"be9b8b25-b79e-4236-836d-f04b4c570782\") " pod="openshift-marketplace/redhat-marketplace-56khl" Sep 30 17:41:14 crc kubenswrapper[4778]: I0930 17:41:14.806741 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9b8b25-b79e-4236-836d-f04b4c570782-utilities\") pod \"redhat-marketplace-56khl\" (UID: \"be9b8b25-b79e-4236-836d-f04b4c570782\") " pod="openshift-marketplace/redhat-marketplace-56khl" Sep 30 17:41:14 crc kubenswrapper[4778]: I0930 17:41:14.806935 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be9b8b25-b79e-4236-836d-f04b4c570782-catalog-content\") pod \"redhat-marketplace-56khl\" (UID: \"be9b8b25-b79e-4236-836d-f04b4c570782\") " pod="openshift-marketplace/redhat-marketplace-56khl" Sep 30 17:41:14 crc kubenswrapper[4778]: I0930 17:41:14.807136 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frxzs\" (UniqueName: \"kubernetes.io/projected/be9b8b25-b79e-4236-836d-f04b4c570782-kube-api-access-frxzs\") pod \"redhat-marketplace-56khl\" (UID: \"be9b8b25-b79e-4236-836d-f04b4c570782\") " pod="openshift-marketplace/redhat-marketplace-56khl" Sep 30 17:41:14 crc kubenswrapper[4778]: I0930 17:41:14.807315 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9b8b25-b79e-4236-836d-f04b4c570782-utilities\") pod \"redhat-marketplace-56khl\" (UID: \"be9b8b25-b79e-4236-836d-f04b4c570782\") " pod="openshift-marketplace/redhat-marketplace-56khl" Sep 30 17:41:14 crc kubenswrapper[4778]: I0930 17:41:14.808266 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be9b8b25-b79e-4236-836d-f04b4c570782-catalog-content\") pod \"redhat-marketplace-56khl\" (UID: \"be9b8b25-b79e-4236-836d-f04b4c570782\") " pod="openshift-marketplace/redhat-marketplace-56khl" Sep 30 17:41:14 crc kubenswrapper[4778]: I0930 17:41:14.812206 4778 patch_prober.go:28] interesting 
pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 17:41:14 crc kubenswrapper[4778]: I0930 17:41:14.812365 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 17:41:14 crc kubenswrapper[4778]: I0930 17:41:14.832007 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frxzs\" (UniqueName: \"kubernetes.io/projected/be9b8b25-b79e-4236-836d-f04b4c570782-kube-api-access-frxzs\") pod \"redhat-marketplace-56khl\" (UID: \"be9b8b25-b79e-4236-836d-f04b4c570782\") " pod="openshift-marketplace/redhat-marketplace-56khl"
Sep 30 17:41:14 crc kubenswrapper[4778]: I0930 17:41:14.895148 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-56khl"
Sep 30 17:41:15 crc kubenswrapper[4778]: I0930 17:41:15.353629 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-56khl"]
Sep 30 17:41:15 crc kubenswrapper[4778]: W0930 17:41:15.361498 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe9b8b25_b79e_4236_836d_f04b4c570782.slice/crio-2d6649fc1a9dce49bdec61cc1fe531917da3e46d9a12d3f5a35a6df559db1627 WatchSource:0}: Error finding container 2d6649fc1a9dce49bdec61cc1fe531917da3e46d9a12d3f5a35a6df559db1627: Status 404 returned error can't find the container with id 2d6649fc1a9dce49bdec61cc1fe531917da3e46d9a12d3f5a35a6df559db1627
Sep 30 17:41:15 crc kubenswrapper[4778]: I0930 17:41:15.711556 4778 generic.go:334] "Generic (PLEG): container finished" podID="be9b8b25-b79e-4236-836d-f04b4c570782" containerID="3c36284f03587849ef29829f4c829cac3c6728330945099f41dac603b4537744" exitCode=0
Sep 30 17:41:15 crc kubenswrapper[4778]: I0930 17:41:15.711654 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-56khl" event={"ID":"be9b8b25-b79e-4236-836d-f04b4c570782","Type":"ContainerDied","Data":"3c36284f03587849ef29829f4c829cac3c6728330945099f41dac603b4537744"}
Sep 30 17:41:15 crc kubenswrapper[4778]: I0930 17:41:15.711990 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-56khl" event={"ID":"be9b8b25-b79e-4236-836d-f04b4c570782","Type":"ContainerStarted","Data":"2d6649fc1a9dce49bdec61cc1fe531917da3e46d9a12d3f5a35a6df559db1627"}
Sep 30 17:41:15 crc kubenswrapper[4778]: I0930 17:41:15.714063 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 17:41:17 crc kubenswrapper[4778]: I0930 17:41:17.734015 4778 generic.go:334] "Generic (PLEG): container finished" podID="be9b8b25-b79e-4236-836d-f04b4c570782" containerID="7ca5c7a04f099e2e231c70d7a31bf868777468cad72c799da987cc1e2dc30ad8" exitCode=0
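
The two short-lived containers that just exited (3c36284f… and 7ca5c7a0…) are the catalog pod's init steps: the names extract-utilities and extract-content appear in the RemoveStaleState entries further down, and they share the utilities and catalog-content emptyDir volumes with the long-running registry-server container. A sketch of that pod shape; only the container and volume names come from the log, while images and mount paths are placeholders invented for illustration:

```go
package main

import corev1 "k8s.io/api/core/v1"

// catalogPodSpec sketches the structure implied by the log: two init
// containers populate shared emptyDir volumes, then registry-server serves
// the extracted catalog content.
func catalogPodSpec() corev1.PodSpec {
	emptyDir := corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}
	mounts := []corev1.VolumeMount{
		{Name: "utilities", MountPath: "/utilities"},       // assumed path
		{Name: "catalog-content", MountPath: "/extracted"}, // assumed path
	}
	return corev1.PodSpec{
		Volumes: []corev1.Volume{
			{Name: "utilities", VolumeSource: emptyDir},
			{Name: "catalog-content", VolumeSource: emptyDir},
		},
		InitContainers: []corev1.Container{
			{Name: "extract-utilities", Image: "example.invalid/opm:latest", VolumeMounts: mounts},
			{Name: "extract-content", Image: "example.invalid/catalog:latest", VolumeMounts: mounts},
		},
		Containers: []corev1.Container{
			{Name: "registry-server", Image: "example.invalid/catalog:latest", VolumeMounts: mounts},
		},
	}
}

func main() { _ = catalogPodSpec() }
```
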
event={"ID":"be9b8b25-b79e-4236-836d-f04b4c570782","Type":"ContainerDied","Data":"7ca5c7a04f099e2e231c70d7a31bf868777468cad72c799da987cc1e2dc30ad8"} Sep 30 17:41:19 crc kubenswrapper[4778]: I0930 17:41:19.757865 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-56khl" event={"ID":"be9b8b25-b79e-4236-836d-f04b4c570782","Type":"ContainerStarted","Data":"b7d392101b7f5404da0b8cef347d842577ed7771f31f1ba8d078054722b1fd16"} Sep 30 17:41:19 crc kubenswrapper[4778]: I0930 17:41:19.788806 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-56khl" podStartSLOduration=2.514208439 podStartE2EDuration="5.788781223s" podCreationTimestamp="2025-09-30 17:41:14 +0000 UTC" firstStartedPulling="2025-09-30 17:41:15.713845308 +0000 UTC m=+1414.703743111" lastFinishedPulling="2025-09-30 17:41:18.988418052 +0000 UTC m=+1417.978315895" observedRunningTime="2025-09-30 17:41:19.782817717 +0000 UTC m=+1418.772715550" watchObservedRunningTime="2025-09-30 17:41:19.788781223 +0000 UTC m=+1418.778679056" Sep 30 17:41:24 crc kubenswrapper[4778]: I0930 17:41:24.896021 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-56khl" Sep 30 17:41:24 crc kubenswrapper[4778]: I0930 17:41:24.897014 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-56khl" Sep 30 17:41:24 crc kubenswrapper[4778]: I0930 17:41:24.958920 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-56khl" Sep 30 17:41:25 crc kubenswrapper[4778]: I0930 17:41:25.918379 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-56khl" Sep 30 17:41:25 crc kubenswrapper[4778]: I0930 17:41:25.983891 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-56khl"] Sep 30 17:41:27 crc kubenswrapper[4778]: I0930 17:41:27.846288 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-56khl" podUID="be9b8b25-b79e-4236-836d-f04b4c570782" containerName="registry-server" containerID="cri-o://b7d392101b7f5404da0b8cef347d842577ed7771f31f1ba8d078054722b1fd16" gracePeriod=2 Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.341278 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-56khl" Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.479337 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frxzs\" (UniqueName: \"kubernetes.io/projected/be9b8b25-b79e-4236-836d-f04b4c570782-kube-api-access-frxzs\") pod \"be9b8b25-b79e-4236-836d-f04b4c570782\" (UID: \"be9b8b25-b79e-4236-836d-f04b4c570782\") " Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.479411 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9b8b25-b79e-4236-836d-f04b4c570782-utilities\") pod \"be9b8b25-b79e-4236-836d-f04b4c570782\" (UID: \"be9b8b25-b79e-4236-836d-f04b4c570782\") " Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.479447 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be9b8b25-b79e-4236-836d-f04b4c570782-catalog-content\") pod \"be9b8b25-b79e-4236-836d-f04b4c570782\" (UID: \"be9b8b25-b79e-4236-836d-f04b4c570782\") " Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.480859 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be9b8b25-b79e-4236-836d-f04b4c570782-utilities" (OuterVolumeSpecName: "utilities") pod "be9b8b25-b79e-4236-836d-f04b4c570782" (UID: "be9b8b25-b79e-4236-836d-f04b4c570782"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.485102 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be9b8b25-b79e-4236-836d-f04b4c570782-kube-api-access-frxzs" (OuterVolumeSpecName: "kube-api-access-frxzs") pod "be9b8b25-b79e-4236-836d-f04b4c570782" (UID: "be9b8b25-b79e-4236-836d-f04b4c570782"). InnerVolumeSpecName "kube-api-access-frxzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.494491 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be9b8b25-b79e-4236-836d-f04b4c570782-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be9b8b25-b79e-4236-836d-f04b4c570782" (UID: "be9b8b25-b79e-4236-836d-f04b4c570782"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.581282 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frxzs\" (UniqueName: \"kubernetes.io/projected/be9b8b25-b79e-4236-836d-f04b4c570782-kube-api-access-frxzs\") on node \"crc\" DevicePath \"\"" Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.581312 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be9b8b25-b79e-4236-836d-f04b4c570782-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.581321 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be9b8b25-b79e-4236-836d-f04b4c570782-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.858985 4778 generic.go:334] "Generic (PLEG): container finished" podID="be9b8b25-b79e-4236-836d-f04b4c570782" containerID="b7d392101b7f5404da0b8cef347d842577ed7771f31f1ba8d078054722b1fd16" exitCode=0 Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.859088 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-56khl" Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.859088 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-56khl" event={"ID":"be9b8b25-b79e-4236-836d-f04b4c570782","Type":"ContainerDied","Data":"b7d392101b7f5404da0b8cef347d842577ed7771f31f1ba8d078054722b1fd16"} Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.859558 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-56khl" event={"ID":"be9b8b25-b79e-4236-836d-f04b4c570782","Type":"ContainerDied","Data":"2d6649fc1a9dce49bdec61cc1fe531917da3e46d9a12d3f5a35a6df559db1627"} Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.859584 4778 scope.go:117] "RemoveContainer" containerID="b7d392101b7f5404da0b8cef347d842577ed7771f31f1ba8d078054722b1fd16" Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.877224 4778 scope.go:117] "RemoveContainer" containerID="7ca5c7a04f099e2e231c70d7a31bf868777468cad72c799da987cc1e2dc30ad8" Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.911534 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-56khl"] Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.921241 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-56khl"] Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.924490 4778 scope.go:117] "RemoveContainer" containerID="3c36284f03587849ef29829f4c829cac3c6728330945099f41dac603b4537744" Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.947695 4778 scope.go:117] "RemoveContainer" containerID="b7d392101b7f5404da0b8cef347d842577ed7771f31f1ba8d078054722b1fd16" Sep 30 17:41:28 crc kubenswrapper[4778]: E0930 17:41:28.948280 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7d392101b7f5404da0b8cef347d842577ed7771f31f1ba8d078054722b1fd16\": container with ID starting with b7d392101b7f5404da0b8cef347d842577ed7771f31f1ba8d078054722b1fd16 not found: ID does not exist" containerID="b7d392101b7f5404da0b8cef347d842577ed7771f31f1ba8d078054722b1fd16" Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.948334 4778 
Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.948334 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7d392101b7f5404da0b8cef347d842577ed7771f31f1ba8d078054722b1fd16"} err="failed to get container status \"b7d392101b7f5404da0b8cef347d842577ed7771f31f1ba8d078054722b1fd16\": rpc error: code = NotFound desc = could not find container \"b7d392101b7f5404da0b8cef347d842577ed7771f31f1ba8d078054722b1fd16\": container with ID starting with b7d392101b7f5404da0b8cef347d842577ed7771f31f1ba8d078054722b1fd16 not found: ID does not exist"
Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.948368 4778 scope.go:117] "RemoveContainer" containerID="7ca5c7a04f099e2e231c70d7a31bf868777468cad72c799da987cc1e2dc30ad8"
Sep 30 17:41:28 crc kubenswrapper[4778]: E0930 17:41:28.948783 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ca5c7a04f099e2e231c70d7a31bf868777468cad72c799da987cc1e2dc30ad8\": container with ID starting with 7ca5c7a04f099e2e231c70d7a31bf868777468cad72c799da987cc1e2dc30ad8 not found: ID does not exist" containerID="7ca5c7a04f099e2e231c70d7a31bf868777468cad72c799da987cc1e2dc30ad8"
Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.948815 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ca5c7a04f099e2e231c70d7a31bf868777468cad72c799da987cc1e2dc30ad8"} err="failed to get container status \"7ca5c7a04f099e2e231c70d7a31bf868777468cad72c799da987cc1e2dc30ad8\": rpc error: code = NotFound desc = could not find container \"7ca5c7a04f099e2e231c70d7a31bf868777468cad72c799da987cc1e2dc30ad8\": container with ID starting with 7ca5c7a04f099e2e231c70d7a31bf868777468cad72c799da987cc1e2dc30ad8 not found: ID does not exist"
Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.948837 4778 scope.go:117] "RemoveContainer" containerID="3c36284f03587849ef29829f4c829cac3c6728330945099f41dac603b4537744"
Sep 30 17:41:28 crc kubenswrapper[4778]: E0930 17:41:28.949344 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c36284f03587849ef29829f4c829cac3c6728330945099f41dac603b4537744\": container with ID starting with 3c36284f03587849ef29829f4c829cac3c6728330945099f41dac603b4537744 not found: ID does not exist" containerID="3c36284f03587849ef29829f4c829cac3c6728330945099f41dac603b4537744"
Sep 30 17:41:28 crc kubenswrapper[4778]: I0930 17:41:28.949452 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c36284f03587849ef29829f4c829cac3c6728330945099f41dac603b4537744"} err="failed to get container status \"3c36284f03587849ef29829f4c829cac3c6728330945099f41dac603b4537744\": rpc error: code = NotFound desc = could not find container \"3c36284f03587849ef29829f4c829cac3c6728330945099f41dac603b4537744\": container with ID starting with 3c36284f03587849ef29829f4c829cac3c6728330945099f41dac603b4537744 not found: ID does not exist"
Sep 30 17:41:29 crc kubenswrapper[4778]: I0930 17:41:29.763099 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be9b8b25-b79e-4236-836d-f04b4c570782" path="/var/lib/kubelet/pods/be9b8b25-b79e-4236-836d-f04b4c570782/volumes"
Sep 30 17:41:44 crc kubenswrapper[4778]: I0930 17:41:44.812003 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp
127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:41:44 crc kubenswrapper[4778]: I0930 17:41:44.812533 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.130501 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mw5lr"] Sep 30 17:42:01 crc kubenswrapper[4778]: E0930 17:42:01.131473 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be9b8b25-b79e-4236-836d-f04b4c570782" containerName="extract-utilities" Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.131491 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="be9b8b25-b79e-4236-836d-f04b4c570782" containerName="extract-utilities" Sep 30 17:42:01 crc kubenswrapper[4778]: E0930 17:42:01.131514 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be9b8b25-b79e-4236-836d-f04b4c570782" containerName="registry-server" Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.131520 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="be9b8b25-b79e-4236-836d-f04b4c570782" containerName="registry-server" Sep 30 17:42:01 crc kubenswrapper[4778]: E0930 17:42:01.131543 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be9b8b25-b79e-4236-836d-f04b4c570782" containerName="extract-content" Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.131548 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="be9b8b25-b79e-4236-836d-f04b4c570782" containerName="extract-content" Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.131824 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="be9b8b25-b79e-4236-836d-f04b4c570782" containerName="registry-server" Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.133048 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mw5lr" Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.151013 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mw5lr"] Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.307147 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-utilities\") pod \"community-operators-mw5lr\" (UID: \"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad\") " pod="openshift-marketplace/community-operators-mw5lr" Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.307243 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trqpb\" (UniqueName: \"kubernetes.io/projected/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-kube-api-access-trqpb\") pod \"community-operators-mw5lr\" (UID: \"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad\") " pod="openshift-marketplace/community-operators-mw5lr" Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.307302 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-catalog-content\") pod \"community-operators-mw5lr\" (UID: \"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad\") " pod="openshift-marketplace/community-operators-mw5lr" Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.408529 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-catalog-content\") pod \"community-operators-mw5lr\" (UID: \"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad\") " pod="openshift-marketplace/community-operators-mw5lr" Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.408711 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-utilities\") pod \"community-operators-mw5lr\" (UID: \"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad\") " pod="openshift-marketplace/community-operators-mw5lr" Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.408813 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trqpb\" (UniqueName: \"kubernetes.io/projected/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-kube-api-access-trqpb\") pod \"community-operators-mw5lr\" (UID: \"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad\") " pod="openshift-marketplace/community-operators-mw5lr" Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.409288 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-catalog-content\") pod \"community-operators-mw5lr\" (UID: \"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad\") " pod="openshift-marketplace/community-operators-mw5lr" Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.409643 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-utilities\") pod \"community-operators-mw5lr\" (UID: \"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad\") " pod="openshift-marketplace/community-operators-mw5lr" Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.444947 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-trqpb\" (UniqueName: \"kubernetes.io/projected/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-kube-api-access-trqpb\") pod \"community-operators-mw5lr\" (UID: \"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad\") " pod="openshift-marketplace/community-operators-mw5lr" Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.450522 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mw5lr" Sep 30 17:42:01 crc kubenswrapper[4778]: I0930 17:42:01.974084 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mw5lr"] Sep 30 17:42:02 crc kubenswrapper[4778]: I0930 17:42:02.230156 4778 generic.go:334] "Generic (PLEG): container finished" podID="42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad" containerID="6705ac9133f555ae03c0f4096f1dbf43efd80db3bdbce4a6a2e364d2bfe399fc" exitCode=0 Sep 30 17:42:02 crc kubenswrapper[4778]: I0930 17:42:02.230340 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mw5lr" event={"ID":"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad","Type":"ContainerDied","Data":"6705ac9133f555ae03c0f4096f1dbf43efd80db3bdbce4a6a2e364d2bfe399fc"} Sep 30 17:42:02 crc kubenswrapper[4778]: I0930 17:42:02.230584 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mw5lr" event={"ID":"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad","Type":"ContainerStarted","Data":"94fdfec76d99dffa2b433a495cd1e4bfea91d98ecfe878e2947141e328ac91ab"} Sep 30 17:42:04 crc kubenswrapper[4778]: I0930 17:42:04.256297 4778 generic.go:334] "Generic (PLEG): container finished" podID="42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad" containerID="4b6663cae2ebe3e54307f2a19bf772f28182cd2c1beff19aea6fe8e6dcfa7cc6" exitCode=0 Sep 30 17:42:04 crc kubenswrapper[4778]: I0930 17:42:04.256397 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mw5lr" event={"ID":"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad","Type":"ContainerDied","Data":"4b6663cae2ebe3e54307f2a19bf772f28182cd2c1beff19aea6fe8e6dcfa7cc6"} Sep 30 17:42:06 crc kubenswrapper[4778]: I0930 17:42:06.284786 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mw5lr" event={"ID":"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad","Type":"ContainerStarted","Data":"024e0838e3d5e942e14cac467997df6bc3b93585addb8f9d6114582a0488171f"} Sep 30 17:42:06 crc kubenswrapper[4778]: I0930 17:42:06.315245 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mw5lr" podStartSLOduration=2.382324428 podStartE2EDuration="5.315219768s" podCreationTimestamp="2025-09-30 17:42:01 +0000 UTC" firstStartedPulling="2025-09-30 17:42:02.234847103 +0000 UTC m=+1461.224744906" lastFinishedPulling="2025-09-30 17:42:05.167742433 +0000 UTC m=+1464.157640246" observedRunningTime="2025-09-30 17:42:06.311183912 +0000 UTC m=+1465.301081765" watchObservedRunningTime="2025-09-30 17:42:06.315219768 +0000 UTC m=+1465.305117611" Sep 30 17:42:11 crc kubenswrapper[4778]: I0930 17:42:11.451559 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mw5lr" Sep 30 17:42:11 crc kubenswrapper[4778]: I0930 17:42:11.452305 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mw5lr" Sep 30 17:42:11 crc kubenswrapper[4778]: I0930 17:42:11.508272 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mw5lr" Sep 30 17:42:12 crc kubenswrapper[4778]: I0930 17:42:12.408302 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mw5lr" Sep 30 17:42:12 crc kubenswrapper[4778]: I0930 17:42:12.466721 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mw5lr"] Sep 30 17:42:14 crc kubenswrapper[4778]: I0930 17:42:14.354143 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mw5lr" podUID="42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad" containerName="registry-server" containerID="cri-o://024e0838e3d5e942e14cac467997df6bc3b93585addb8f9d6114582a0488171f" gracePeriod=2 Sep 30 17:42:14 crc kubenswrapper[4778]: I0930 17:42:14.812305 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:42:14 crc kubenswrapper[4778]: I0930 17:42:14.812381 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:42:14 crc kubenswrapper[4778]: I0930 17:42:14.812458 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:42:14 crc kubenswrapper[4778]: I0930 17:42:14.813390 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9bb2b540381c42fbc380e8cd8f4bc8733104d08b1605c2c2155245fff3463fd6"} pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:42:14 crc kubenswrapper[4778]: I0930 17:42:14.813505 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" containerID="cri-o://9bb2b540381c42fbc380e8cd8f4bc8733104d08b1605c2c2155245fff3463fd6" gracePeriod=600 Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.351539 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mw5lr" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.366712 4778 generic.go:334] "Generic (PLEG): container finished" podID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerID="9bb2b540381c42fbc380e8cd8f4bc8733104d08b1605c2c2155245fff3463fd6" exitCode=0 Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.366781 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerDied","Data":"9bb2b540381c42fbc380e8cd8f4bc8733104d08b1605c2c2155245fff3463fd6"} Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.366815 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerStarted","Data":"62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312"} Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.366837 4778 scope.go:117] "RemoveContainer" containerID="326830dff025b3a623c0b61fcab0be0c2c9f34db066bfe20235b1b092a5d8935" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.376417 4778 generic.go:334] "Generic (PLEG): container finished" podID="42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad" containerID="024e0838e3d5e942e14cac467997df6bc3b93585addb8f9d6114582a0488171f" exitCode=0 Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.376461 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mw5lr" event={"ID":"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad","Type":"ContainerDied","Data":"024e0838e3d5e942e14cac467997df6bc3b93585addb8f9d6114582a0488171f"} Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.376491 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mw5lr" event={"ID":"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad","Type":"ContainerDied","Data":"94fdfec76d99dffa2b433a495cd1e4bfea91d98ecfe878e2947141e328ac91ab"} Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.376555 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mw5lr" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.423242 4778 scope.go:117] "RemoveContainer" containerID="024e0838e3d5e942e14cac467997df6bc3b93585addb8f9d6114582a0488171f" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.428010 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-utilities\") pod \"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad\" (UID: \"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad\") " Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.428151 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trqpb\" (UniqueName: \"kubernetes.io/projected/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-kube-api-access-trqpb\") pod \"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad\" (UID: \"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad\") " Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.428599 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-catalog-content\") pod \"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad\" (UID: \"42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad\") " Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.429214 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-utilities" (OuterVolumeSpecName: "utilities") pod "42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad" (UID: "42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.429498 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.434820 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-kube-api-access-trqpb" (OuterVolumeSpecName: "kube-api-access-trqpb") pod "42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad" (UID: "42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad"). InnerVolumeSpecName "kube-api-access-trqpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.448726 4778 scope.go:117] "RemoveContainer" containerID="4b6663cae2ebe3e54307f2a19bf772f28182cd2c1beff19aea6fe8e6dcfa7cc6" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.474731 4778 scope.go:117] "RemoveContainer" containerID="6705ac9133f555ae03c0f4096f1dbf43efd80db3bdbce4a6a2e364d2bfe399fc" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.490999 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad" (UID: "42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.530800 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.530837 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trqpb\" (UniqueName: \"kubernetes.io/projected/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad-kube-api-access-trqpb\") on node \"crc\" DevicePath \"\"" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.532368 4778 scope.go:117] "RemoveContainer" containerID="024e0838e3d5e942e14cac467997df6bc3b93585addb8f9d6114582a0488171f" Sep 30 17:42:15 crc kubenswrapper[4778]: E0930 17:42:15.532722 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"024e0838e3d5e942e14cac467997df6bc3b93585addb8f9d6114582a0488171f\": container with ID starting with 024e0838e3d5e942e14cac467997df6bc3b93585addb8f9d6114582a0488171f not found: ID does not exist" containerID="024e0838e3d5e942e14cac467997df6bc3b93585addb8f9d6114582a0488171f" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.532762 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"024e0838e3d5e942e14cac467997df6bc3b93585addb8f9d6114582a0488171f"} err="failed to get container status \"024e0838e3d5e942e14cac467997df6bc3b93585addb8f9d6114582a0488171f\": rpc error: code = NotFound desc = could not find container \"024e0838e3d5e942e14cac467997df6bc3b93585addb8f9d6114582a0488171f\": container with ID starting with 024e0838e3d5e942e14cac467997df6bc3b93585addb8f9d6114582a0488171f not found: ID does not exist" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.532788 4778 scope.go:117] "RemoveContainer" containerID="4b6663cae2ebe3e54307f2a19bf772f28182cd2c1beff19aea6fe8e6dcfa7cc6" Sep 30 17:42:15 crc kubenswrapper[4778]: E0930 17:42:15.533001 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b6663cae2ebe3e54307f2a19bf772f28182cd2c1beff19aea6fe8e6dcfa7cc6\": container with ID starting with 4b6663cae2ebe3e54307f2a19bf772f28182cd2c1beff19aea6fe8e6dcfa7cc6 not found: ID does not exist" containerID="4b6663cae2ebe3e54307f2a19bf772f28182cd2c1beff19aea6fe8e6dcfa7cc6" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.533026 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b6663cae2ebe3e54307f2a19bf772f28182cd2c1beff19aea6fe8e6dcfa7cc6"} err="failed to get container status \"4b6663cae2ebe3e54307f2a19bf772f28182cd2c1beff19aea6fe8e6dcfa7cc6\": rpc error: code = NotFound desc = could not find container \"4b6663cae2ebe3e54307f2a19bf772f28182cd2c1beff19aea6fe8e6dcfa7cc6\": container with ID starting with 4b6663cae2ebe3e54307f2a19bf772f28182cd2c1beff19aea6fe8e6dcfa7cc6 not found: ID does not exist" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.533039 4778 scope.go:117] "RemoveContainer" containerID="6705ac9133f555ae03c0f4096f1dbf43efd80db3bdbce4a6a2e364d2bfe399fc" Sep 30 17:42:15 crc kubenswrapper[4778]: E0930 17:42:15.533246 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6705ac9133f555ae03c0f4096f1dbf43efd80db3bdbce4a6a2e364d2bfe399fc\": container with ID starting with 
6705ac9133f555ae03c0f4096f1dbf43efd80db3bdbce4a6a2e364d2bfe399fc not found: ID does not exist" containerID="6705ac9133f555ae03c0f4096f1dbf43efd80db3bdbce4a6a2e364d2bfe399fc" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.533271 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6705ac9133f555ae03c0f4096f1dbf43efd80db3bdbce4a6a2e364d2bfe399fc"} err="failed to get container status \"6705ac9133f555ae03c0f4096f1dbf43efd80db3bdbce4a6a2e364d2bfe399fc\": rpc error: code = NotFound desc = could not find container \"6705ac9133f555ae03c0f4096f1dbf43efd80db3bdbce4a6a2e364d2bfe399fc\": container with ID starting with 6705ac9133f555ae03c0f4096f1dbf43efd80db3bdbce4a6a2e364d2bfe399fc not found: ID does not exist" Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.735432 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mw5lr"] Sep 30 17:42:15 crc kubenswrapper[4778]: I0930 17:42:15.743121 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mw5lr"] Sep 30 17:42:17 crc kubenswrapper[4778]: I0930 17:42:17.731906 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad" path="/var/lib/kubelet/pods/42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad/volumes" Sep 30 17:42:45 crc kubenswrapper[4778]: I0930 17:42:45.798324 4778 scope.go:117] "RemoveContainer" containerID="9399b08040b6e3234013278aa69986bf9af8df5134d42c7c0dfbfb4397c02dd0" Sep 30 17:42:45 crc kubenswrapper[4778]: I0930 17:42:45.829906 4778 scope.go:117] "RemoveContainer" containerID="0cca25f47fab96516cced40e050dd5b41df8ec24b24e971fb35e4eea27f0fbab" Sep 30 17:43:02 crc kubenswrapper[4778]: I0930 17:43:02.599890 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8ffvn"] Sep 30 17:43:02 crc kubenswrapper[4778]: E0930 17:43:02.603420 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad" containerName="registry-server" Sep 30 17:43:02 crc kubenswrapper[4778]: I0930 17:43:02.603661 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad" containerName="registry-server" Sep 30 17:43:02 crc kubenswrapper[4778]: E0930 17:43:02.603856 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad" containerName="extract-utilities" Sep 30 17:43:02 crc kubenswrapper[4778]: I0930 17:43:02.604017 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad" containerName="extract-utilities" Sep 30 17:43:02 crc kubenswrapper[4778]: E0930 17:43:02.604169 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad" containerName="extract-content" Sep 30 17:43:02 crc kubenswrapper[4778]: I0930 17:43:02.604285 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad" containerName="extract-content" Sep 30 17:43:02 crc kubenswrapper[4778]: I0930 17:43:02.604863 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="42e5671a-6b1d-4d7a-8ce4-fa3eca55d8ad" containerName="registry-server" Sep 30 17:43:02 crc kubenswrapper[4778]: I0930 17:43:02.633292 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8ffvn"] Sep 30 17:43:02 crc kubenswrapper[4778]: I0930 17:43:02.633443 4778 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8ffvn" Sep 30 17:43:02 crc kubenswrapper[4778]: I0930 17:43:02.743310 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wn9tk\" (UniqueName: \"kubernetes.io/projected/274011cd-4114-43df-8623-c03d1911c27b-kube-api-access-wn9tk\") pod \"certified-operators-8ffvn\" (UID: \"274011cd-4114-43df-8623-c03d1911c27b\") " pod="openshift-marketplace/certified-operators-8ffvn" Sep 30 17:43:02 crc kubenswrapper[4778]: I0930 17:43:02.743365 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/274011cd-4114-43df-8623-c03d1911c27b-utilities\") pod \"certified-operators-8ffvn\" (UID: \"274011cd-4114-43df-8623-c03d1911c27b\") " pod="openshift-marketplace/certified-operators-8ffvn" Sep 30 17:43:02 crc kubenswrapper[4778]: I0930 17:43:02.743473 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/274011cd-4114-43df-8623-c03d1911c27b-catalog-content\") pod \"certified-operators-8ffvn\" (UID: \"274011cd-4114-43df-8623-c03d1911c27b\") " pod="openshift-marketplace/certified-operators-8ffvn" Sep 30 17:43:02 crc kubenswrapper[4778]: I0930 17:43:02.844897 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wn9tk\" (UniqueName: \"kubernetes.io/projected/274011cd-4114-43df-8623-c03d1911c27b-kube-api-access-wn9tk\") pod \"certified-operators-8ffvn\" (UID: \"274011cd-4114-43df-8623-c03d1911c27b\") " pod="openshift-marketplace/certified-operators-8ffvn" Sep 30 17:43:02 crc kubenswrapper[4778]: I0930 17:43:02.844980 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/274011cd-4114-43df-8623-c03d1911c27b-utilities\") pod \"certified-operators-8ffvn\" (UID: \"274011cd-4114-43df-8623-c03d1911c27b\") " pod="openshift-marketplace/certified-operators-8ffvn" Sep 30 17:43:02 crc kubenswrapper[4778]: I0930 17:43:02.845058 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/274011cd-4114-43df-8623-c03d1911c27b-catalog-content\") pod \"certified-operators-8ffvn\" (UID: \"274011cd-4114-43df-8623-c03d1911c27b\") " pod="openshift-marketplace/certified-operators-8ffvn" Sep 30 17:43:02 crc kubenswrapper[4778]: I0930 17:43:02.847061 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/274011cd-4114-43df-8623-c03d1911c27b-utilities\") pod \"certified-operators-8ffvn\" (UID: \"274011cd-4114-43df-8623-c03d1911c27b\") " pod="openshift-marketplace/certified-operators-8ffvn" Sep 30 17:43:02 crc kubenswrapper[4778]: I0930 17:43:02.847087 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/274011cd-4114-43df-8623-c03d1911c27b-catalog-content\") pod \"certified-operators-8ffvn\" (UID: \"274011cd-4114-43df-8623-c03d1911c27b\") " pod="openshift-marketplace/certified-operators-8ffvn" Sep 30 17:43:02 crc kubenswrapper[4778]: I0930 17:43:02.863309 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wn9tk\" (UniqueName: 
\"kubernetes.io/projected/274011cd-4114-43df-8623-c03d1911c27b-kube-api-access-wn9tk\") pod \"certified-operators-8ffvn\" (UID: \"274011cd-4114-43df-8623-c03d1911c27b\") " pod="openshift-marketplace/certified-operators-8ffvn" Sep 30 17:43:02 crc kubenswrapper[4778]: I0930 17:43:02.966245 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8ffvn" Sep 30 17:43:03 crc kubenswrapper[4778]: I0930 17:43:03.472158 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8ffvn"] Sep 30 17:43:03 crc kubenswrapper[4778]: W0930 17:43:03.476536 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod274011cd_4114_43df_8623_c03d1911c27b.slice/crio-3c13437db2059acabbfa501137ff7fcfca8909ceadcc83e15b801e909f6527d8 WatchSource:0}: Error finding container 3c13437db2059acabbfa501137ff7fcfca8909ceadcc83e15b801e909f6527d8: Status 404 returned error can't find the container with id 3c13437db2059acabbfa501137ff7fcfca8909ceadcc83e15b801e909f6527d8 Sep 30 17:43:03 crc kubenswrapper[4778]: I0930 17:43:03.914743 4778 generic.go:334] "Generic (PLEG): container finished" podID="274011cd-4114-43df-8623-c03d1911c27b" containerID="f5994def70e17367f50cada806441152848f793c73c90fafd889ea5936aefba8" exitCode=0 Sep 30 17:43:03 crc kubenswrapper[4778]: I0930 17:43:03.914820 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ffvn" event={"ID":"274011cd-4114-43df-8623-c03d1911c27b","Type":"ContainerDied","Data":"f5994def70e17367f50cada806441152848f793c73c90fafd889ea5936aefba8"} Sep 30 17:43:03 crc kubenswrapper[4778]: I0930 17:43:03.915133 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ffvn" event={"ID":"274011cd-4114-43df-8623-c03d1911c27b","Type":"ContainerStarted","Data":"3c13437db2059acabbfa501137ff7fcfca8909ceadcc83e15b801e909f6527d8"} Sep 30 17:43:05 crc kubenswrapper[4778]: I0930 17:43:05.936593 4778 generic.go:334] "Generic (PLEG): container finished" podID="274011cd-4114-43df-8623-c03d1911c27b" containerID="e9d30d904b10e9116363f6f67bfcbfd6819f5e8cf408055b8312b59347797e8c" exitCode=0 Sep 30 17:43:05 crc kubenswrapper[4778]: I0930 17:43:05.936705 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ffvn" event={"ID":"274011cd-4114-43df-8623-c03d1911c27b","Type":"ContainerDied","Data":"e9d30d904b10e9116363f6f67bfcbfd6819f5e8cf408055b8312b59347797e8c"} Sep 30 17:43:06 crc kubenswrapper[4778]: I0930 17:43:06.946453 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ffvn" event={"ID":"274011cd-4114-43df-8623-c03d1911c27b","Type":"ContainerStarted","Data":"171a5dcec3c343d4a963ed9679d07c9540dfd073daed0fd0a48a179e1f48fee6"} Sep 30 17:43:06 crc kubenswrapper[4778]: I0930 17:43:06.971881 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8ffvn" podStartSLOduration=2.485815787 podStartE2EDuration="4.971861203s" podCreationTimestamp="2025-09-30 17:43:02 +0000 UTC" firstStartedPulling="2025-09-30 17:43:03.916516673 +0000 UTC m=+1522.906414476" lastFinishedPulling="2025-09-30 17:43:06.402562049 +0000 UTC m=+1525.392459892" observedRunningTime="2025-09-30 17:43:06.966682372 +0000 UTC m=+1525.956580245" watchObservedRunningTime="2025-09-30 17:43:06.971861203 +0000 UTC 
m=+1525.961759016" Sep 30 17:43:12 crc kubenswrapper[4778]: I0930 17:43:12.967053 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8ffvn" Sep 30 17:43:12 crc kubenswrapper[4778]: I0930 17:43:12.967581 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8ffvn" Sep 30 17:43:13 crc kubenswrapper[4778]: I0930 17:43:13.062649 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8ffvn" Sep 30 17:43:13 crc kubenswrapper[4778]: I0930 17:43:13.147433 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8ffvn" Sep 30 17:43:13 crc kubenswrapper[4778]: I0930 17:43:13.306151 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8ffvn"] Sep 30 17:43:15 crc kubenswrapper[4778]: I0930 17:43:15.040804 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8ffvn" podUID="274011cd-4114-43df-8623-c03d1911c27b" containerName="registry-server" containerID="cri-o://171a5dcec3c343d4a963ed9679d07c9540dfd073daed0fd0a48a179e1f48fee6" gracePeriod=2 Sep 30 17:43:15 crc kubenswrapper[4778]: I0930 17:43:15.545090 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8ffvn" Sep 30 17:43:15 crc kubenswrapper[4778]: I0930 17:43:15.587647 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/274011cd-4114-43df-8623-c03d1911c27b-catalog-content\") pod \"274011cd-4114-43df-8623-c03d1911c27b\" (UID: \"274011cd-4114-43df-8623-c03d1911c27b\") " Sep 30 17:43:15 crc kubenswrapper[4778]: I0930 17:43:15.587947 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/274011cd-4114-43df-8623-c03d1911c27b-utilities\") pod \"274011cd-4114-43df-8623-c03d1911c27b\" (UID: \"274011cd-4114-43df-8623-c03d1911c27b\") " Sep 30 17:43:15 crc kubenswrapper[4778]: I0930 17:43:15.588002 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wn9tk\" (UniqueName: \"kubernetes.io/projected/274011cd-4114-43df-8623-c03d1911c27b-kube-api-access-wn9tk\") pod \"274011cd-4114-43df-8623-c03d1911c27b\" (UID: \"274011cd-4114-43df-8623-c03d1911c27b\") " Sep 30 17:43:15 crc kubenswrapper[4778]: I0930 17:43:15.592880 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/274011cd-4114-43df-8623-c03d1911c27b-utilities" (OuterVolumeSpecName: "utilities") pod "274011cd-4114-43df-8623-c03d1911c27b" (UID: "274011cd-4114-43df-8623-c03d1911c27b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:43:15 crc kubenswrapper[4778]: I0930 17:43:15.596142 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/274011cd-4114-43df-8623-c03d1911c27b-kube-api-access-wn9tk" (OuterVolumeSpecName: "kube-api-access-wn9tk") pod "274011cd-4114-43df-8623-c03d1911c27b" (UID: "274011cd-4114-43df-8623-c03d1911c27b"). InnerVolumeSpecName "kube-api-access-wn9tk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:43:15 crc kubenswrapper[4778]: I0930 17:43:15.658130 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/274011cd-4114-43df-8623-c03d1911c27b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "274011cd-4114-43df-8623-c03d1911c27b" (UID: "274011cd-4114-43df-8623-c03d1911c27b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:43:15 crc kubenswrapper[4778]: I0930 17:43:15.690795 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/274011cd-4114-43df-8623-c03d1911c27b-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:43:15 crc kubenswrapper[4778]: I0930 17:43:15.690821 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wn9tk\" (UniqueName: \"kubernetes.io/projected/274011cd-4114-43df-8623-c03d1911c27b-kube-api-access-wn9tk\") on node \"crc\" DevicePath \"\"" Sep 30 17:43:15 crc kubenswrapper[4778]: I0930 17:43:15.690843 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/274011cd-4114-43df-8623-c03d1911c27b-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:43:16 crc kubenswrapper[4778]: I0930 17:43:16.060814 4778 generic.go:334] "Generic (PLEG): container finished" podID="274011cd-4114-43df-8623-c03d1911c27b" containerID="171a5dcec3c343d4a963ed9679d07c9540dfd073daed0fd0a48a179e1f48fee6" exitCode=0 Sep 30 17:43:16 crc kubenswrapper[4778]: I0930 17:43:16.060869 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ffvn" event={"ID":"274011cd-4114-43df-8623-c03d1911c27b","Type":"ContainerDied","Data":"171a5dcec3c343d4a963ed9679d07c9540dfd073daed0fd0a48a179e1f48fee6"} Sep 30 17:43:16 crc kubenswrapper[4778]: I0930 17:43:16.060901 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8ffvn" event={"ID":"274011cd-4114-43df-8623-c03d1911c27b","Type":"ContainerDied","Data":"3c13437db2059acabbfa501137ff7fcfca8909ceadcc83e15b801e909f6527d8"} Sep 30 17:43:16 crc kubenswrapper[4778]: I0930 17:43:16.060924 4778 scope.go:117] "RemoveContainer" containerID="171a5dcec3c343d4a963ed9679d07c9540dfd073daed0fd0a48a179e1f48fee6" Sep 30 17:43:16 crc kubenswrapper[4778]: I0930 17:43:16.061777 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8ffvn" Sep 30 17:43:16 crc kubenswrapper[4778]: I0930 17:43:16.112590 4778 scope.go:117] "RemoveContainer" containerID="e9d30d904b10e9116363f6f67bfcbfd6819f5e8cf408055b8312b59347797e8c" Sep 30 17:43:16 crc kubenswrapper[4778]: I0930 17:43:16.113705 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8ffvn"] Sep 30 17:43:16 crc kubenswrapper[4778]: I0930 17:43:16.123475 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8ffvn"] Sep 30 17:43:16 crc kubenswrapper[4778]: I0930 17:43:16.135174 4778 scope.go:117] "RemoveContainer" containerID="f5994def70e17367f50cada806441152848f793c73c90fafd889ea5936aefba8" Sep 30 17:43:16 crc kubenswrapper[4778]: I0930 17:43:16.174525 4778 scope.go:117] "RemoveContainer" containerID="171a5dcec3c343d4a963ed9679d07c9540dfd073daed0fd0a48a179e1f48fee6" Sep 30 17:43:16 crc kubenswrapper[4778]: E0930 17:43:16.174993 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"171a5dcec3c343d4a963ed9679d07c9540dfd073daed0fd0a48a179e1f48fee6\": container with ID starting with 171a5dcec3c343d4a963ed9679d07c9540dfd073daed0fd0a48a179e1f48fee6 not found: ID does not exist" containerID="171a5dcec3c343d4a963ed9679d07c9540dfd073daed0fd0a48a179e1f48fee6" Sep 30 17:43:16 crc kubenswrapper[4778]: I0930 17:43:16.175043 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171a5dcec3c343d4a963ed9679d07c9540dfd073daed0fd0a48a179e1f48fee6"} err="failed to get container status \"171a5dcec3c343d4a963ed9679d07c9540dfd073daed0fd0a48a179e1f48fee6\": rpc error: code = NotFound desc = could not find container \"171a5dcec3c343d4a963ed9679d07c9540dfd073daed0fd0a48a179e1f48fee6\": container with ID starting with 171a5dcec3c343d4a963ed9679d07c9540dfd073daed0fd0a48a179e1f48fee6 not found: ID does not exist" Sep 30 17:43:16 crc kubenswrapper[4778]: I0930 17:43:16.175078 4778 scope.go:117] "RemoveContainer" containerID="e9d30d904b10e9116363f6f67bfcbfd6819f5e8cf408055b8312b59347797e8c" Sep 30 17:43:16 crc kubenswrapper[4778]: E0930 17:43:16.175468 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9d30d904b10e9116363f6f67bfcbfd6819f5e8cf408055b8312b59347797e8c\": container with ID starting with e9d30d904b10e9116363f6f67bfcbfd6819f5e8cf408055b8312b59347797e8c not found: ID does not exist" containerID="e9d30d904b10e9116363f6f67bfcbfd6819f5e8cf408055b8312b59347797e8c" Sep 30 17:43:16 crc kubenswrapper[4778]: I0930 17:43:16.175500 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9d30d904b10e9116363f6f67bfcbfd6819f5e8cf408055b8312b59347797e8c"} err="failed to get container status \"e9d30d904b10e9116363f6f67bfcbfd6819f5e8cf408055b8312b59347797e8c\": rpc error: code = NotFound desc = could not find container \"e9d30d904b10e9116363f6f67bfcbfd6819f5e8cf408055b8312b59347797e8c\": container with ID starting with e9d30d904b10e9116363f6f67bfcbfd6819f5e8cf408055b8312b59347797e8c not found: ID does not exist" Sep 30 17:43:16 crc kubenswrapper[4778]: I0930 17:43:16.175546 4778 scope.go:117] "RemoveContainer" containerID="f5994def70e17367f50cada806441152848f793c73c90fafd889ea5936aefba8" Sep 30 17:43:16 crc kubenswrapper[4778]: E0930 17:43:16.175913 4778 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f5994def70e17367f50cada806441152848f793c73c90fafd889ea5936aefba8\": container with ID starting with f5994def70e17367f50cada806441152848f793c73c90fafd889ea5936aefba8 not found: ID does not exist" containerID="f5994def70e17367f50cada806441152848f793c73c90fafd889ea5936aefba8" Sep 30 17:43:16 crc kubenswrapper[4778]: I0930 17:43:16.175940 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5994def70e17367f50cada806441152848f793c73c90fafd889ea5936aefba8"} err="failed to get container status \"f5994def70e17367f50cada806441152848f793c73c90fafd889ea5936aefba8\": rpc error: code = NotFound desc = could not find container \"f5994def70e17367f50cada806441152848f793c73c90fafd889ea5936aefba8\": container with ID starting with f5994def70e17367f50cada806441152848f793c73c90fafd889ea5936aefba8 not found: ID does not exist" Sep 30 17:43:17 crc kubenswrapper[4778]: I0930 17:43:17.732664 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="274011cd-4114-43df-8623-c03d1911c27b" path="/var/lib/kubelet/pods/274011cd-4114-43df-8623-c03d1911c27b/volumes" Sep 30 17:43:22 crc kubenswrapper[4778]: I0930 17:43:22.050304 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-kfwtc"] Sep 30 17:43:22 crc kubenswrapper[4778]: I0930 17:43:22.062221 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-gtlnx"] Sep 30 17:43:22 crc kubenswrapper[4778]: I0930 17:43:22.070253 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-kfwtc"] Sep 30 17:43:22 crc kubenswrapper[4778]: I0930 17:43:22.078240 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-gtlnx"] Sep 30 17:43:23 crc kubenswrapper[4778]: I0930 17:43:23.729943 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58ce14f2-48e1-4f93-ab82-0a35af3a4e99" path="/var/lib/kubelet/pods/58ce14f2-48e1-4f93-ab82-0a35af3a4e99/volumes" Sep 30 17:43:23 crc kubenswrapper[4778]: I0930 17:43:23.731577 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecf58537-7743-4ba0-b78a-2286a7f55c4c" path="/var/lib/kubelet/pods/ecf58537-7743-4ba0-b78a-2286a7f55c4c/volumes" Sep 30 17:43:28 crc kubenswrapper[4778]: I0930 17:43:28.034158 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-vqn7w"] Sep 30 17:43:28 crc kubenswrapper[4778]: I0930 17:43:28.043567 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-vqn7w"] Sep 30 17:43:29 crc kubenswrapper[4778]: I0930 17:43:29.750707 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cae8b89-9230-48a1-9cdd-c604911a37f8" path="/var/lib/kubelet/pods/9cae8b89-9230-48a1-9cdd-c604911a37f8/volumes" Sep 30 17:43:31 crc kubenswrapper[4778]: I0930 17:43:31.054845 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-e75d-account-create-ntsr9"] Sep 30 17:43:31 crc kubenswrapper[4778]: I0930 17:43:31.071154 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-e75d-account-create-ntsr9"] Sep 30 17:43:31 crc kubenswrapper[4778]: I0930 17:43:31.728825 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd" path="/var/lib/kubelet/pods/a36c1486-6ed5-4e61-8cf1-9c5e2e86c2cd/volumes" Sep 30 17:43:32 crc kubenswrapper[4778]: I0930 17:43:32.027647 
4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6b3b-account-create-tkcf4"] Sep 30 17:43:32 crc kubenswrapper[4778]: I0930 17:43:32.034160 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6b3b-account-create-tkcf4"] Sep 30 17:43:33 crc kubenswrapper[4778]: I0930 17:43:33.736492 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0f118d4-4773-4b15-9a0a-ad44742e1789" path="/var/lib/kubelet/pods/f0f118d4-4773-4b15-9a0a-ad44742e1789/volumes" Sep 30 17:43:45 crc kubenswrapper[4778]: I0930 17:43:45.918458 4778 scope.go:117] "RemoveContainer" containerID="a208f50b480873a95044cd8c3b446c771723ff42dfd34ca6bb4bf2a09f7d0673" Sep 30 17:43:45 crc kubenswrapper[4778]: I0930 17:43:45.939528 4778 scope.go:117] "RemoveContainer" containerID="b6a36cc0389c35e552f3e26ab13b21d115aac0b76a03ebdc379e7ee64e3e9c7d" Sep 30 17:43:46 crc kubenswrapper[4778]: I0930 17:43:46.023271 4778 scope.go:117] "RemoveContainer" containerID="66a07cca25e1471cfffb95e97d9fca21966fbbd1e9c049eaa3448c648b688da0" Sep 30 17:43:46 crc kubenswrapper[4778]: I0930 17:43:46.079844 4778 scope.go:117] "RemoveContainer" containerID="f559aa002ba988092e649881bf7cd55402dca4a53e799efe53f1b8acd7e9d7b6" Sep 30 17:43:46 crc kubenswrapper[4778]: I0930 17:43:46.109859 4778 scope.go:117] "RemoveContainer" containerID="88891d050159c2339358a233f6b0de5f5824a47b1f78435b4ca2b78a9d24be88" Sep 30 17:43:46 crc kubenswrapper[4778]: I0930 17:43:46.131458 4778 scope.go:117] "RemoveContainer" containerID="ed4e2bbff3d0e1e73a3305bf44e27ea16089afd45fa23d87649fe8e2688703ec" Sep 30 17:43:46 crc kubenswrapper[4778]: I0930 17:43:46.164835 4778 scope.go:117] "RemoveContainer" containerID="732f1d8ff0a3c72e429a3d061e666cdc934cb83c356945dc4d07abce0527e2c9" Sep 30 17:43:47 crc kubenswrapper[4778]: I0930 17:43:47.031098 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-dda9-account-create-dnfpr"] Sep 30 17:43:47 crc kubenswrapper[4778]: I0930 17:43:47.043128 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-dda9-account-create-dnfpr"] Sep 30 17:43:47 crc kubenswrapper[4778]: I0930 17:43:47.728078 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b9285b1-4e8e-4600-9dbb-1b0bf16a742d" path="/var/lib/kubelet/pods/7b9285b1-4e8e-4600-9dbb-1b0bf16a742d/volumes" Sep 30 17:43:52 crc kubenswrapper[4778]: I0930 17:43:52.042988 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-2t8zc"] Sep 30 17:43:52 crc kubenswrapper[4778]: I0930 17:43:52.057036 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-tgmpn"] Sep 30 17:43:52 crc kubenswrapper[4778]: I0930 17:43:52.070559 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-tgmpn"] Sep 30 17:43:52 crc kubenswrapper[4778]: I0930 17:43:52.079063 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-2t8zc"] Sep 30 17:43:53 crc kubenswrapper[4778]: I0930 17:43:53.732562 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90afd016-0994-4d25-ae04-1609f9b811b0" path="/var/lib/kubelet/pods/90afd016-0994-4d25-ae04-1609f9b811b0/volumes" Sep 30 17:43:53 crc kubenswrapper[4778]: I0930 17:43:53.734220 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d17e6a20-e807-4f0b-8a17-551f2c547ae5" path="/var/lib/kubelet/pods/d17e6a20-e807-4f0b-8a17-551f2c547ae5/volumes" Sep 30 17:43:56 crc kubenswrapper[4778]: I0930 
17:43:56.046599 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-vf8bf"] Sep 30 17:43:56 crc kubenswrapper[4778]: I0930 17:43:56.063291 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-vf8bf"] Sep 30 17:43:57 crc kubenswrapper[4778]: I0930 17:43:57.735564 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e327b2a5-293e-472e-a1e8-e25f01d54232" path="/var/lib/kubelet/pods/e327b2a5-293e-472e-a1e8-e25f01d54232/volumes" Sep 30 17:44:05 crc kubenswrapper[4778]: I0930 17:44:05.061696 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8860-account-create-jp7dv"] Sep 30 17:44:05 crc kubenswrapper[4778]: I0930 17:44:05.073766 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8860-account-create-jp7dv"] Sep 30 17:44:05 crc kubenswrapper[4778]: I0930 17:44:05.728858 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615fbb2f-3311-4f19-9ffa-6c01b802ae13" path="/var/lib/kubelet/pods/615fbb2f-3311-4f19-9ffa-6c01b802ae13/volumes" Sep 30 17:44:10 crc kubenswrapper[4778]: I0930 17:44:10.029921 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f16f-account-create-n47bv"] Sep 30 17:44:10 crc kubenswrapper[4778]: I0930 17:44:10.038212 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f16f-account-create-n47bv"] Sep 30 17:44:11 crc kubenswrapper[4778]: I0930 17:44:11.727743 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4206afc7-5079-41a3-887c-62f6083aa72c" path="/var/lib/kubelet/pods/4206afc7-5079-41a3-887c-62f6083aa72c/volumes" Sep 30 17:44:16 crc kubenswrapper[4778]: I0930 17:44:16.043612 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-pk9vz"] Sep 30 17:44:16 crc kubenswrapper[4778]: I0930 17:44:16.057887 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-pk9vz"] Sep 30 17:44:17 crc kubenswrapper[4778]: I0930 17:44:17.726170 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0280205-5cc0-45e5-8f2d-926b300cb348" path="/var/lib/kubelet/pods/e0280205-5cc0-45e5-8f2d-926b300cb348/volumes" Sep 30 17:44:21 crc kubenswrapper[4778]: I0930 17:44:21.043068 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-fmzbx"] Sep 30 17:44:21 crc kubenswrapper[4778]: I0930 17:44:21.051274 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-fmzbx"] Sep 30 17:44:21 crc kubenswrapper[4778]: I0930 17:44:21.723628 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d593f115-5fd5-4db3-9244-8c90696317c4" path="/var/lib/kubelet/pods/d593f115-5fd5-4db3-9244-8c90696317c4/volumes" Sep 30 17:44:40 crc kubenswrapper[4778]: I0930 17:44:40.038518 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-pz5ss"] Sep 30 17:44:40 crc kubenswrapper[4778]: I0930 17:44:40.045877 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-pz5ss"] Sep 30 17:44:41 crc kubenswrapper[4778]: I0930 17:44:41.731806 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92387610-886b-4105-8de1-87fa92e2215e" path="/var/lib/kubelet/pods/92387610-886b-4105-8de1-87fa92e2215e/volumes" Sep 30 17:44:44 crc kubenswrapper[4778]: I0930 17:44:44.812399 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:44:44 crc kubenswrapper[4778]: I0930 17:44:44.813380 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:44:46 crc kubenswrapper[4778]: I0930 17:44:46.038316 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-nz4tj"] Sep 30 17:44:46 crc kubenswrapper[4778]: I0930 17:44:46.051958 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-nz4tj"] Sep 30 17:44:46 crc kubenswrapper[4778]: I0930 17:44:46.316125 4778 scope.go:117] "RemoveContainer" containerID="6389067a01abab59ea73d0ec759ad17247ce56cb7707498b0a78930b1e18f000" Sep 30 17:44:46 crc kubenswrapper[4778]: I0930 17:44:46.358919 4778 scope.go:117] "RemoveContainer" containerID="e64124d4ae4a42703521c628147e5049512a8e8b4191312ed43b78166529eb40" Sep 30 17:44:46 crc kubenswrapper[4778]: I0930 17:44:46.422313 4778 scope.go:117] "RemoveContainer" containerID="22808c1658a4e26f82cff60e782fee9587b2047e51b29ff6a19d97a701a24271" Sep 30 17:44:46 crc kubenswrapper[4778]: I0930 17:44:46.494922 4778 scope.go:117] "RemoveContainer" containerID="6b8c982a5f71a8b2b431ce0aa74b74196e33f6b4908a7617b910d37c683f803a" Sep 30 17:44:46 crc kubenswrapper[4778]: I0930 17:44:46.521326 4778 scope.go:117] "RemoveContainer" containerID="f135482638939de05acdcb30bc45fe93fe6c629b8c27502568aa448986935ace" Sep 30 17:44:46 crc kubenswrapper[4778]: I0930 17:44:46.573813 4778 scope.go:117] "RemoveContainer" containerID="4fa95329fed631df5f5d87fd9e237641382877d7dee074994d1a3e36e010bde5" Sep 30 17:44:46 crc kubenswrapper[4778]: I0930 17:44:46.594978 4778 scope.go:117] "RemoveContainer" containerID="08d85a2e1cdaa63fa3ff21fc3cae356be68eaa98a557e65636e331fbfbc8f25b" Sep 30 17:44:46 crc kubenswrapper[4778]: I0930 17:44:46.616125 4778 scope.go:117] "RemoveContainer" containerID="306c584e17e3861c365bcb461e949b8422b905692509357c61a7828cfb7505e3" Sep 30 17:44:46 crc kubenswrapper[4778]: I0930 17:44:46.640991 4778 scope.go:117] "RemoveContainer" containerID="c4f55799c611a9e25ba2d7b495b5641106b4dcdc0d1b67e66ec29cf003ce0155" Sep 30 17:44:47 crc kubenswrapper[4778]: I0930 17:44:47.743665 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fcfd6aa-7524-4fb2-8230-f35ed39691a9" path="/var/lib/kubelet/pods/9fcfd6aa-7524-4fb2-8230-f35ed39691a9/volumes" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.180362 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74"] Sep 30 17:45:00 crc kubenswrapper[4778]: E0930 17:45:00.181883 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274011cd-4114-43df-8623-c03d1911c27b" containerName="registry-server" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.181913 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="274011cd-4114-43df-8623-c03d1911c27b" containerName="registry-server" Sep 30 17:45:00 crc kubenswrapper[4778]: E0930 17:45:00.181992 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274011cd-4114-43df-8623-c03d1911c27b" 
containerName="extract-content" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.182009 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="274011cd-4114-43df-8623-c03d1911c27b" containerName="extract-content" Sep 30 17:45:00 crc kubenswrapper[4778]: E0930 17:45:00.182037 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="274011cd-4114-43df-8623-c03d1911c27b" containerName="extract-utilities" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.182057 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="274011cd-4114-43df-8623-c03d1911c27b" containerName="extract-utilities" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.182521 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="274011cd-4114-43df-8623-c03d1911c27b" containerName="registry-server" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.186433 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.188958 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.188956 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.207814 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74"] Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.298417 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqr96\" (UniqueName: \"kubernetes.io/projected/3e60d969-0114-4453-b70b-1084cb9848b5-kube-api-access-nqr96\") pod \"collect-profiles-29320905-lcs74\" (UID: \"3e60d969-0114-4453-b70b-1084cb9848b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.298476 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e60d969-0114-4453-b70b-1084cb9848b5-config-volume\") pod \"collect-profiles-29320905-lcs74\" (UID: \"3e60d969-0114-4453-b70b-1084cb9848b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.298768 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e60d969-0114-4453-b70b-1084cb9848b5-secret-volume\") pod \"collect-profiles-29320905-lcs74\" (UID: \"3e60d969-0114-4453-b70b-1084cb9848b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.400430 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e60d969-0114-4453-b70b-1084cb9848b5-secret-volume\") pod \"collect-profiles-29320905-lcs74\" (UID: \"3e60d969-0114-4453-b70b-1084cb9848b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.400508 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqr96\" 
(UniqueName: \"kubernetes.io/projected/3e60d969-0114-4453-b70b-1084cb9848b5-kube-api-access-nqr96\") pod \"collect-profiles-29320905-lcs74\" (UID: \"3e60d969-0114-4453-b70b-1084cb9848b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.400528 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e60d969-0114-4453-b70b-1084cb9848b5-config-volume\") pod \"collect-profiles-29320905-lcs74\" (UID: \"3e60d969-0114-4453-b70b-1084cb9848b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.401786 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e60d969-0114-4453-b70b-1084cb9848b5-config-volume\") pod \"collect-profiles-29320905-lcs74\" (UID: \"3e60d969-0114-4453-b70b-1084cb9848b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.407272 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e60d969-0114-4453-b70b-1084cb9848b5-secret-volume\") pod \"collect-profiles-29320905-lcs74\" (UID: \"3e60d969-0114-4453-b70b-1084cb9848b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.419531 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqr96\" (UniqueName: \"kubernetes.io/projected/3e60d969-0114-4453-b70b-1084cb9848b5-kube-api-access-nqr96\") pod \"collect-profiles-29320905-lcs74\" (UID: \"3e60d969-0114-4453-b70b-1084cb9848b5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.509110 4778 util.go:30] "No sandbox for pod can be found. 
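
[Note] collect-profiles-29320905-lcs74 is a Job pod spawned by the openshift-operator-lifecycle-manager collect-profiles CronJob; the numeric suffix is the scheduled run time in minutes since the Unix epoch, and it decodes to exactly the 17:45:00 timestamp of the SyncLoop ADD above:

    package main

    import (
        "fmt"
        "time"
    )

    // CronJob controllers name child Jobs after the scheduled time expressed
    // in minutes since the Unix epoch; decode the suffix from the log above.
    func main() {
        fmt.Println(time.Unix(29320905*60, 0).UTC())
        // 2025-09-30 17:45:00 +0000 UTC — matching the SyncLoop ADD entry.
    }
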
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74" Sep 30 17:45:00 crc kubenswrapper[4778]: I0930 17:45:00.972997 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74"] Sep 30 17:45:00 crc kubenswrapper[4778]: W0930 17:45:00.973177 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e60d969_0114_4453_b70b_1084cb9848b5.slice/crio-7ba8fa8c8292b6658e63360695fc59a491b3ef0c12bfc0bf04fa47739a3ccf58 WatchSource:0}: Error finding container 7ba8fa8c8292b6658e63360695fc59a491b3ef0c12bfc0bf04fa47739a3ccf58: Status 404 returned error can't find the container with id 7ba8fa8c8292b6658e63360695fc59a491b3ef0c12bfc0bf04fa47739a3ccf58 Sep 30 17:45:01 crc kubenswrapper[4778]: I0930 17:45:01.045797 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-fsrx9"] Sep 30 17:45:01 crc kubenswrapper[4778]: I0930 17:45:01.052278 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-fsrx9"] Sep 30 17:45:01 crc kubenswrapper[4778]: I0930 17:45:01.117419 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74" event={"ID":"3e60d969-0114-4453-b70b-1084cb9848b5","Type":"ContainerStarted","Data":"7ba8fa8c8292b6658e63360695fc59a491b3ef0c12bfc0bf04fa47739a3ccf58"} Sep 30 17:45:01 crc kubenswrapper[4778]: I0930 17:45:01.730863 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ef0e458-8d92-4b76-8da4-24357b6911dc" path="/var/lib/kubelet/pods/3ef0e458-8d92-4b76-8da4-24357b6911dc/volumes" Sep 30 17:45:02 crc kubenswrapper[4778]: I0930 17:45:02.130049 4778 generic.go:334] "Generic (PLEG): container finished" podID="3e60d969-0114-4453-b70b-1084cb9848b5" containerID="b195716fd9868c2fe0b8114062061e977f178486112dc46ed725b3fe71294ddb" exitCode=0 Sep 30 17:45:02 crc kubenswrapper[4778]: I0930 17:45:02.130102 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74" event={"ID":"3e60d969-0114-4453-b70b-1084cb9848b5","Type":"ContainerDied","Data":"b195716fd9868c2fe0b8114062061e977f178486112dc46ed725b3fe71294ddb"} Sep 30 17:45:03 crc kubenswrapper[4778]: I0930 17:45:03.551884 4778 util.go:48] "No ready sandbox for pod can be found. 
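
[Note] The E-level "ContainerStatus from runtime service failed … NotFound" / "DeleteContainer returned error" pairs earlier in this section (for the community-operators and certified-operators container IDs) are noisy but benign: an earlier RemoveContainer pass had already deleted each container, and the retry merely discovers the ID is gone, so the deletion is effectively idempotent. Similarly, collect-profiles-29320905-lcs74 finishing with exitCode=0 is consistent with a Job pod running to completion, which is why the kubelet proceeds straight to volume teardown below instead of restarting the container.
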
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74" Sep 30 17:45:03 crc kubenswrapper[4778]: I0930 17:45:03.664963 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e60d969-0114-4453-b70b-1084cb9848b5-secret-volume\") pod \"3e60d969-0114-4453-b70b-1084cb9848b5\" (UID: \"3e60d969-0114-4453-b70b-1084cb9848b5\") " Sep 30 17:45:03 crc kubenswrapper[4778]: I0930 17:45:03.665033 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e60d969-0114-4453-b70b-1084cb9848b5-config-volume\") pod \"3e60d969-0114-4453-b70b-1084cb9848b5\" (UID: \"3e60d969-0114-4453-b70b-1084cb9848b5\") " Sep 30 17:45:03 crc kubenswrapper[4778]: I0930 17:45:03.665299 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqr96\" (UniqueName: \"kubernetes.io/projected/3e60d969-0114-4453-b70b-1084cb9848b5-kube-api-access-nqr96\") pod \"3e60d969-0114-4453-b70b-1084cb9848b5\" (UID: \"3e60d969-0114-4453-b70b-1084cb9848b5\") " Sep 30 17:45:03 crc kubenswrapper[4778]: I0930 17:45:03.665843 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e60d969-0114-4453-b70b-1084cb9848b5-config-volume" (OuterVolumeSpecName: "config-volume") pod "3e60d969-0114-4453-b70b-1084cb9848b5" (UID: "3e60d969-0114-4453-b70b-1084cb9848b5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 17:45:03 crc kubenswrapper[4778]: I0930 17:45:03.670667 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e60d969-0114-4453-b70b-1084cb9848b5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3e60d969-0114-4453-b70b-1084cb9848b5" (UID: "3e60d969-0114-4453-b70b-1084cb9848b5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 17:45:03 crc kubenswrapper[4778]: I0930 17:45:03.670980 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e60d969-0114-4453-b70b-1084cb9848b5-kube-api-access-nqr96" (OuterVolumeSpecName: "kube-api-access-nqr96") pod "3e60d969-0114-4453-b70b-1084cb9848b5" (UID: "3e60d969-0114-4453-b70b-1084cb9848b5"). InnerVolumeSpecName "kube-api-access-nqr96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:45:03 crc kubenswrapper[4778]: I0930 17:45:03.768253 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e60d969-0114-4453-b70b-1084cb9848b5-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:03 crc kubenswrapper[4778]: I0930 17:45:03.768321 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e60d969-0114-4453-b70b-1084cb9848b5-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:03 crc kubenswrapper[4778]: I0930 17:45:03.768351 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqr96\" (UniqueName: \"kubernetes.io/projected/3e60d969-0114-4453-b70b-1084cb9848b5-kube-api-access-nqr96\") on node \"crc\" DevicePath \"\"" Sep 30 17:45:04 crc kubenswrapper[4778]: I0930 17:45:04.155697 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74" event={"ID":"3e60d969-0114-4453-b70b-1084cb9848b5","Type":"ContainerDied","Data":"7ba8fa8c8292b6658e63360695fc59a491b3ef0c12bfc0bf04fa47739a3ccf58"} Sep 30 17:45:04 crc kubenswrapper[4778]: I0930 17:45:04.155768 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ba8fa8c8292b6658e63360695fc59a491b3ef0c12bfc0bf04fa47739a3ccf58" Sep 30 17:45:04 crc kubenswrapper[4778]: I0930 17:45:04.156292 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320905-lcs74" Sep 30 17:45:14 crc kubenswrapper[4778]: I0930 17:45:14.812126 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:45:14 crc kubenswrapper[4778]: I0930 17:45:14.812944 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:45:26 crc kubenswrapper[4778]: I0930 17:45:26.039224 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-d9t4c"] Sep 30 17:45:26 crc kubenswrapper[4778]: I0930 17:45:26.048955 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-26rxw"] Sep 30 17:45:26 crc kubenswrapper[4778]: I0930 17:45:26.064253 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-7qsbn"] Sep 30 17:45:26 crc kubenswrapper[4778]: I0930 17:45:26.074494 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-26rxw"] Sep 30 17:45:26 crc kubenswrapper[4778]: I0930 17:45:26.083192 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-d9t4c"] Sep 30 17:45:26 crc kubenswrapper[4778]: I0930 17:45:26.090862 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-7qsbn"] Sep 30 17:45:27 crc kubenswrapper[4778]: I0930 17:45:27.727650 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ee6faf7-113e-4545-b3d5-e29e611c4f3e" 
path="/var/lib/kubelet/pods/2ee6faf7-113e-4545-b3d5-e29e611c4f3e/volumes" Sep 30 17:45:27 crc kubenswrapper[4778]: I0930 17:45:27.728747 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65194693-743e-4b92-9bb3-3a9e0ea49273" path="/var/lib/kubelet/pods/65194693-743e-4b92-9bb3-3a9e0ea49273/volumes" Sep 30 17:45:27 crc kubenswrapper[4778]: I0930 17:45:27.729288 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a06f678-ac41-4482-8889-c85b2beb6703" path="/var/lib/kubelet/pods/7a06f678-ac41-4482-8889-c85b2beb6703/volumes" Sep 30 17:45:32 crc kubenswrapper[4778]: I0930 17:45:32.061780 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-80a3-account-create-hjhw5"] Sep 30 17:45:32 crc kubenswrapper[4778]: I0930 17:45:32.074204 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-80a3-account-create-hjhw5"] Sep 30 17:45:33 crc kubenswrapper[4778]: I0930 17:45:33.057149 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-e68e-account-create-2z8ks"] Sep 30 17:45:33 crc kubenswrapper[4778]: I0930 17:45:33.074235 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e14b-account-create-ssj7x"] Sep 30 17:45:33 crc kubenswrapper[4778]: I0930 17:45:33.097416 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e14b-account-create-ssj7x"] Sep 30 17:45:33 crc kubenswrapper[4778]: I0930 17:45:33.109961 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-e68e-account-create-2z8ks"] Sep 30 17:45:33 crc kubenswrapper[4778]: I0930 17:45:33.726842 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="330a8b1b-c4dc-40d1-9af3-ba2d151803aa" path="/var/lib/kubelet/pods/330a8b1b-c4dc-40d1-9af3-ba2d151803aa/volumes" Sep 30 17:45:33 crc kubenswrapper[4778]: I0930 17:45:33.727877 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db" path="/var/lib/kubelet/pods/b0646dac-3ae8-43a0-9a9a-6d0d6ad2f6db/volumes" Sep 30 17:45:33 crc kubenswrapper[4778]: I0930 17:45:33.728507 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2812856-059f-4b6c-975d-7fd7afcdcd05" path="/var/lib/kubelet/pods/c2812856-059f-4b6c-975d-7fd7afcdcd05/volumes" Sep 30 17:45:44 crc kubenswrapper[4778]: I0930 17:45:44.811726 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:45:44 crc kubenswrapper[4778]: I0930 17:45:44.812578 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:45:44 crc kubenswrapper[4778]: I0930 17:45:44.812678 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:45:44 crc kubenswrapper[4778]: I0930 17:45:44.813829 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312"} pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:45:44 crc kubenswrapper[4778]: I0930 17:45:44.813945 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" containerID="cri-o://62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" gracePeriod=600 Sep 30 17:45:44 crc kubenswrapper[4778]: E0930 17:45:44.948025 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:45:45 crc kubenswrapper[4778]: I0930 17:45:45.600123 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerDied","Data":"62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312"} Sep 30 17:45:45 crc kubenswrapper[4778]: I0930 17:45:45.600218 4778 scope.go:117] "RemoveContainer" containerID="9bb2b540381c42fbc380e8cd8f4bc8733104d08b1605c2c2155245fff3463fd6" Sep 30 17:45:45 crc kubenswrapper[4778]: I0930 17:45:45.600056 4778 generic.go:334] "Generic (PLEG): container finished" podID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" exitCode=0 Sep 30 17:45:45 crc kubenswrapper[4778]: I0930 17:45:45.601027 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:45:45 crc kubenswrapper[4778]: E0930 17:45:45.601470 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:45:46 crc kubenswrapper[4778]: I0930 17:45:46.803946 4778 scope.go:117] "RemoveContainer" containerID="eec7da21c92f3dcc0bde7a14596a5ddf1b71d8679514e4c0ffa4b79eb6ad9971" Sep 30 17:45:46 crc kubenswrapper[4778]: I0930 17:45:46.836367 4778 scope.go:117] "RemoveContainer" containerID="0cb1f5e4af2e98ea8536703ab065645ebe25a13bacf52de3595ee646acd2e076" Sep 30 17:45:46 crc kubenswrapper[4778]: I0930 17:45:46.911147 4778 scope.go:117] "RemoveContainer" containerID="339d10f03296fa4ecb05b496d6d8a27094539ab0c9299a9669948aee89faae53" Sep 30 17:45:46 crc kubenswrapper[4778]: I0930 17:45:46.935575 4778 scope.go:117] "RemoveContainer" containerID="1e90c849502d652235be13096d7c92503893b8a572bd1b90f8686a8bb2782454" Sep 30 17:45:46 crc kubenswrapper[4778]: I0930 17:45:46.972957 4778 scope.go:117] "RemoveContainer" containerID="d09d4fe9810398410fd612bca02872713eef7cd44997bab9fe75a6bd52e935db" Sep 30 17:45:47 crc kubenswrapper[4778]: I0930 17:45:47.041501 
4778 scope.go:117] "RemoveContainer" containerID="557d0694d8384b4f06f8f9a6206ea4b6f52ba33153491a5e3568b9fa555d9868" Sep 30 17:45:47 crc kubenswrapper[4778]: I0930 17:45:47.070641 4778 scope.go:117] "RemoveContainer" containerID="0d626821f9389abdcfb4d97202069113d6c13603fcfac4f875d8ad7fd084e962" Sep 30 17:45:47 crc kubenswrapper[4778]: I0930 17:45:47.098697 4778 scope.go:117] "RemoveContainer" containerID="b48cedf96a808739d2166f3d99b58dae88e42ebd7548a34252efb5d4f341ee35" Sep 30 17:45:54 crc kubenswrapper[4778]: I0930 17:45:54.030279 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tcrrp"] Sep 30 17:45:54 crc kubenswrapper[4778]: I0930 17:45:54.042566 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-tcrrp"] Sep 30 17:45:55 crc kubenswrapper[4778]: I0930 17:45:55.726483 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26eefd11-813a-4df9-831d-0a9e60b1be73" path="/var/lib/kubelet/pods/26eefd11-813a-4df9-831d-0a9e60b1be73/volumes" Sep 30 17:45:56 crc kubenswrapper[4778]: I0930 17:45:56.612680 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nggr5"] Sep 30 17:45:56 crc kubenswrapper[4778]: E0930 17:45:56.613444 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e60d969-0114-4453-b70b-1084cb9848b5" containerName="collect-profiles" Sep 30 17:45:56 crc kubenswrapper[4778]: I0930 17:45:56.613463 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e60d969-0114-4453-b70b-1084cb9848b5" containerName="collect-profiles" Sep 30 17:45:56 crc kubenswrapper[4778]: I0930 17:45:56.613654 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e60d969-0114-4453-b70b-1084cb9848b5" containerName="collect-profiles" Sep 30 17:45:56 crc kubenswrapper[4778]: I0930 17:45:56.615232 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nggr5" Sep 30 17:45:56 crc kubenswrapper[4778]: I0930 17:45:56.624920 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nggr5"] Sep 30 17:45:56 crc kubenswrapper[4778]: I0930 17:45:56.691347 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnnj8\" (UniqueName: \"kubernetes.io/projected/803cc6c3-b46f-4dbe-be84-659b4de02e9a-kube-api-access-fnnj8\") pod \"redhat-operators-nggr5\" (UID: \"803cc6c3-b46f-4dbe-be84-659b4de02e9a\") " pod="openshift-marketplace/redhat-operators-nggr5" Sep 30 17:45:56 crc kubenswrapper[4778]: I0930 17:45:56.691671 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/803cc6c3-b46f-4dbe-be84-659b4de02e9a-utilities\") pod \"redhat-operators-nggr5\" (UID: \"803cc6c3-b46f-4dbe-be84-659b4de02e9a\") " pod="openshift-marketplace/redhat-operators-nggr5" Sep 30 17:45:56 crc kubenswrapper[4778]: I0930 17:45:56.691817 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/803cc6c3-b46f-4dbe-be84-659b4de02e9a-catalog-content\") pod \"redhat-operators-nggr5\" (UID: \"803cc6c3-b46f-4dbe-be84-659b4de02e9a\") " pod="openshift-marketplace/redhat-operators-nggr5" Sep 30 17:45:56 crc kubenswrapper[4778]: I0930 17:45:56.794406 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnnj8\" (UniqueName: \"kubernetes.io/projected/803cc6c3-b46f-4dbe-be84-659b4de02e9a-kube-api-access-fnnj8\") pod \"redhat-operators-nggr5\" (UID: \"803cc6c3-b46f-4dbe-be84-659b4de02e9a\") " pod="openshift-marketplace/redhat-operators-nggr5" Sep 30 17:45:56 crc kubenswrapper[4778]: I0930 17:45:56.795543 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/803cc6c3-b46f-4dbe-be84-659b4de02e9a-utilities\") pod \"redhat-operators-nggr5\" (UID: \"803cc6c3-b46f-4dbe-be84-659b4de02e9a\") " pod="openshift-marketplace/redhat-operators-nggr5" Sep 30 17:45:56 crc kubenswrapper[4778]: I0930 17:45:56.796295 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/803cc6c3-b46f-4dbe-be84-659b4de02e9a-utilities\") pod \"redhat-operators-nggr5\" (UID: \"803cc6c3-b46f-4dbe-be84-659b4de02e9a\") " pod="openshift-marketplace/redhat-operators-nggr5" Sep 30 17:45:56 crc kubenswrapper[4778]: I0930 17:45:56.796576 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/803cc6c3-b46f-4dbe-be84-659b4de02e9a-catalog-content\") pod \"redhat-operators-nggr5\" (UID: \"803cc6c3-b46f-4dbe-be84-659b4de02e9a\") " pod="openshift-marketplace/redhat-operators-nggr5" Sep 30 17:45:56 crc kubenswrapper[4778]: I0930 17:45:56.797054 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/803cc6c3-b46f-4dbe-be84-659b4de02e9a-catalog-content\") pod \"redhat-operators-nggr5\" (UID: \"803cc6c3-b46f-4dbe-be84-659b4de02e9a\") " pod="openshift-marketplace/redhat-operators-nggr5" Sep 30 17:45:56 crc kubenswrapper[4778]: I0930 17:45:56.823511 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fnnj8\" (UniqueName: \"kubernetes.io/projected/803cc6c3-b46f-4dbe-be84-659b4de02e9a-kube-api-access-fnnj8\") pod \"redhat-operators-nggr5\" (UID: \"803cc6c3-b46f-4dbe-be84-659b4de02e9a\") " pod="openshift-marketplace/redhat-operators-nggr5" Sep 30 17:45:56 crc kubenswrapper[4778]: I0930 17:45:56.961108 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nggr5" Sep 30 17:45:57 crc kubenswrapper[4778]: I0930 17:45:57.211398 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nggr5"] Sep 30 17:45:57 crc kubenswrapper[4778]: I0930 17:45:57.723055 4778 generic.go:334] "Generic (PLEG): container finished" podID="803cc6c3-b46f-4dbe-be84-659b4de02e9a" containerID="8bf995fdc93788cc322fb7fe4ab574b4e8c12cb6164e98fc8f616c7b38ab34dc" exitCode=0 Sep 30 17:45:57 crc kubenswrapper[4778]: I0930 17:45:57.727603 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nggr5" event={"ID":"803cc6c3-b46f-4dbe-be84-659b4de02e9a","Type":"ContainerDied","Data":"8bf995fdc93788cc322fb7fe4ab574b4e8c12cb6164e98fc8f616c7b38ab34dc"} Sep 30 17:45:57 crc kubenswrapper[4778]: I0930 17:45:57.728468 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nggr5" event={"ID":"803cc6c3-b46f-4dbe-be84-659b4de02e9a","Type":"ContainerStarted","Data":"87bd40c955cde1fef9f14f6779d9033199cffe69f87353e76a29a778dd9de73a"} Sep 30 17:45:58 crc kubenswrapper[4778]: I0930 17:45:58.715131 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:45:58 crc kubenswrapper[4778]: E0930 17:45:58.717320 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:45:59 crc kubenswrapper[4778]: I0930 17:45:59.750756 4778 generic.go:334] "Generic (PLEG): container finished" podID="803cc6c3-b46f-4dbe-be84-659b4de02e9a" containerID="dbd852e99402e5c0d903736d682d662744ce0eea3c6e7cf9a1f8d04f40a22416" exitCode=0 Sep 30 17:45:59 crc kubenswrapper[4778]: I0930 17:45:59.750858 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nggr5" event={"ID":"803cc6c3-b46f-4dbe-be84-659b4de02e9a","Type":"ContainerDied","Data":"dbd852e99402e5c0d903736d682d662744ce0eea3c6e7cf9a1f8d04f40a22416"} Sep 30 17:46:00 crc kubenswrapper[4778]: I0930 17:46:00.763076 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nggr5" event={"ID":"803cc6c3-b46f-4dbe-be84-659b4de02e9a","Type":"ContainerStarted","Data":"aeefca6aec3a338e044d436cb040bfe44ba9b3c387b7a929039fb46ebffc19f4"} Sep 30 17:46:00 crc kubenswrapper[4778]: I0930 17:46:00.794365 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nggr5" podStartSLOduration=2.321251077 podStartE2EDuration="4.794342324s" podCreationTimestamp="2025-09-30 17:45:56 +0000 UTC" firstStartedPulling="2025-09-30 17:45:57.725890491 +0000 UTC m=+1696.715788294" lastFinishedPulling="2025-09-30 17:46:00.198981708 +0000 UTC m=+1699.188879541" 
observedRunningTime="2025-09-30 17:46:00.790990479 +0000 UTC m=+1699.780888282" watchObservedRunningTime="2025-09-30 17:46:00.794342324 +0000 UTC m=+1699.784240147" Sep 30 17:46:06 crc kubenswrapper[4778]: I0930 17:46:06.961439 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nggr5" Sep 30 17:46:06 crc kubenswrapper[4778]: I0930 17:46:06.962218 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nggr5" Sep 30 17:46:07 crc kubenswrapper[4778]: I0930 17:46:07.025399 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nggr5" Sep 30 17:46:07 crc kubenswrapper[4778]: I0930 17:46:07.919164 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nggr5" Sep 30 17:46:07 crc kubenswrapper[4778]: I0930 17:46:07.990687 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nggr5"] Sep 30 17:46:09 crc kubenswrapper[4778]: I0930 17:46:09.714179 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:46:09 crc kubenswrapper[4778]: E0930 17:46:09.714798 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:46:09 crc kubenswrapper[4778]: I0930 17:46:09.861433 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nggr5" podUID="803cc6c3-b46f-4dbe-be84-659b4de02e9a" containerName="registry-server" containerID="cri-o://aeefca6aec3a338e044d436cb040bfe44ba9b3c387b7a929039fb46ebffc19f4" gracePeriod=2 Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.310181 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nggr5" Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.357675 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnnj8\" (UniqueName: \"kubernetes.io/projected/803cc6c3-b46f-4dbe-be84-659b4de02e9a-kube-api-access-fnnj8\") pod \"803cc6c3-b46f-4dbe-be84-659b4de02e9a\" (UID: \"803cc6c3-b46f-4dbe-be84-659b4de02e9a\") " Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.357821 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/803cc6c3-b46f-4dbe-be84-659b4de02e9a-catalog-content\") pod \"803cc6c3-b46f-4dbe-be84-659b4de02e9a\" (UID: \"803cc6c3-b46f-4dbe-be84-659b4de02e9a\") " Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.358011 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/803cc6c3-b46f-4dbe-be84-659b4de02e9a-utilities\") pod \"803cc6c3-b46f-4dbe-be84-659b4de02e9a\" (UID: \"803cc6c3-b46f-4dbe-be84-659b4de02e9a\") " Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.358990 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/803cc6c3-b46f-4dbe-be84-659b4de02e9a-utilities" (OuterVolumeSpecName: "utilities") pod "803cc6c3-b46f-4dbe-be84-659b4de02e9a" (UID: "803cc6c3-b46f-4dbe-be84-659b4de02e9a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.363093 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/803cc6c3-b46f-4dbe-be84-659b4de02e9a-kube-api-access-fnnj8" (OuterVolumeSpecName: "kube-api-access-fnnj8") pod "803cc6c3-b46f-4dbe-be84-659b4de02e9a" (UID: "803cc6c3-b46f-4dbe-be84-659b4de02e9a"). InnerVolumeSpecName "kube-api-access-fnnj8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.429512 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/803cc6c3-b46f-4dbe-be84-659b4de02e9a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "803cc6c3-b46f-4dbe-be84-659b4de02e9a" (UID: "803cc6c3-b46f-4dbe-be84-659b4de02e9a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.460153 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/803cc6c3-b46f-4dbe-be84-659b4de02e9a-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.460198 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnnj8\" (UniqueName: \"kubernetes.io/projected/803cc6c3-b46f-4dbe-be84-659b4de02e9a-kube-api-access-fnnj8\") on node \"crc\" DevicePath \"\"" Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.460211 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/803cc6c3-b46f-4dbe-be84-659b4de02e9a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.877697 4778 generic.go:334] "Generic (PLEG): container finished" podID="803cc6c3-b46f-4dbe-be84-659b4de02e9a" containerID="aeefca6aec3a338e044d436cb040bfe44ba9b3c387b7a929039fb46ebffc19f4" exitCode=0 Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.877762 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nggr5" event={"ID":"803cc6c3-b46f-4dbe-be84-659b4de02e9a","Type":"ContainerDied","Data":"aeefca6aec3a338e044d436cb040bfe44ba9b3c387b7a929039fb46ebffc19f4"} Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.878095 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nggr5" event={"ID":"803cc6c3-b46f-4dbe-be84-659b4de02e9a","Type":"ContainerDied","Data":"87bd40c955cde1fef9f14f6779d9033199cffe69f87353e76a29a778dd9de73a"} Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.878129 4778 scope.go:117] "RemoveContainer" containerID="aeefca6aec3a338e044d436cb040bfe44ba9b3c387b7a929039fb46ebffc19f4" Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.877810 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nggr5" Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.908246 4778 scope.go:117] "RemoveContainer" containerID="dbd852e99402e5c0d903736d682d662744ce0eea3c6e7cf9a1f8d04f40a22416" Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.925790 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nggr5"] Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.940350 4778 scope.go:117] "RemoveContainer" containerID="8bf995fdc93788cc322fb7fe4ab574b4e8c12cb6164e98fc8f616c7b38ab34dc" Sep 30 17:46:10 crc kubenswrapper[4778]: I0930 17:46:10.942324 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nggr5"] Sep 30 17:46:11 crc kubenswrapper[4778]: I0930 17:46:11.017072 4778 scope.go:117] "RemoveContainer" containerID="aeefca6aec3a338e044d436cb040bfe44ba9b3c387b7a929039fb46ebffc19f4" Sep 30 17:46:11 crc kubenswrapper[4778]: E0930 17:46:11.017660 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeefca6aec3a338e044d436cb040bfe44ba9b3c387b7a929039fb46ebffc19f4\": container with ID starting with aeefca6aec3a338e044d436cb040bfe44ba9b3c387b7a929039fb46ebffc19f4 not found: ID does not exist" containerID="aeefca6aec3a338e044d436cb040bfe44ba9b3c387b7a929039fb46ebffc19f4" Sep 30 17:46:11 crc kubenswrapper[4778]: I0930 17:46:11.017704 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeefca6aec3a338e044d436cb040bfe44ba9b3c387b7a929039fb46ebffc19f4"} err="failed to get container status \"aeefca6aec3a338e044d436cb040bfe44ba9b3c387b7a929039fb46ebffc19f4\": rpc error: code = NotFound desc = could not find container \"aeefca6aec3a338e044d436cb040bfe44ba9b3c387b7a929039fb46ebffc19f4\": container with ID starting with aeefca6aec3a338e044d436cb040bfe44ba9b3c387b7a929039fb46ebffc19f4 not found: ID does not exist" Sep 30 17:46:11 crc kubenswrapper[4778]: I0930 17:46:11.017735 4778 scope.go:117] "RemoveContainer" containerID="dbd852e99402e5c0d903736d682d662744ce0eea3c6e7cf9a1f8d04f40a22416" Sep 30 17:46:11 crc kubenswrapper[4778]: E0930 17:46:11.018120 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbd852e99402e5c0d903736d682d662744ce0eea3c6e7cf9a1f8d04f40a22416\": container with ID starting with dbd852e99402e5c0d903736d682d662744ce0eea3c6e7cf9a1f8d04f40a22416 not found: ID does not exist" containerID="dbd852e99402e5c0d903736d682d662744ce0eea3c6e7cf9a1f8d04f40a22416" Sep 30 17:46:11 crc kubenswrapper[4778]: I0930 17:46:11.018161 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbd852e99402e5c0d903736d682d662744ce0eea3c6e7cf9a1f8d04f40a22416"} err="failed to get container status \"dbd852e99402e5c0d903736d682d662744ce0eea3c6e7cf9a1f8d04f40a22416\": rpc error: code = NotFound desc = could not find container \"dbd852e99402e5c0d903736d682d662744ce0eea3c6e7cf9a1f8d04f40a22416\": container with ID starting with dbd852e99402e5c0d903736d682d662744ce0eea3c6e7cf9a1f8d04f40a22416 not found: ID does not exist" Sep 30 17:46:11 crc kubenswrapper[4778]: I0930 17:46:11.018188 4778 scope.go:117] "RemoveContainer" containerID="8bf995fdc93788cc322fb7fe4ab574b4e8c12cb6164e98fc8f616c7b38ab34dc" Sep 30 17:46:11 crc kubenswrapper[4778]: E0930 17:46:11.019450 4778 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"8bf995fdc93788cc322fb7fe4ab574b4e8c12cb6164e98fc8f616c7b38ab34dc\": container with ID starting with 8bf995fdc93788cc322fb7fe4ab574b4e8c12cb6164e98fc8f616c7b38ab34dc not found: ID does not exist" containerID="8bf995fdc93788cc322fb7fe4ab574b4e8c12cb6164e98fc8f616c7b38ab34dc" Sep 30 17:46:11 crc kubenswrapper[4778]: I0930 17:46:11.019479 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bf995fdc93788cc322fb7fe4ab574b4e8c12cb6164e98fc8f616c7b38ab34dc"} err="failed to get container status \"8bf995fdc93788cc322fb7fe4ab574b4e8c12cb6164e98fc8f616c7b38ab34dc\": rpc error: code = NotFound desc = could not find container \"8bf995fdc93788cc322fb7fe4ab574b4e8c12cb6164e98fc8f616c7b38ab34dc\": container with ID starting with 8bf995fdc93788cc322fb7fe4ab574b4e8c12cb6164e98fc8f616c7b38ab34dc not found: ID does not exist" Sep 30 17:46:11 crc kubenswrapper[4778]: I0930 17:46:11.053155 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-qf925"] Sep 30 17:46:11 crc kubenswrapper[4778]: I0930 17:46:11.061813 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-qf925"] Sep 30 17:46:11 crc kubenswrapper[4778]: I0930 17:46:11.724852 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="803cc6c3-b46f-4dbe-be84-659b4de02e9a" path="/var/lib/kubelet/pods/803cc6c3-b46f-4dbe-be84-659b4de02e9a/volumes" Sep 30 17:46:11 crc kubenswrapper[4778]: I0930 17:46:11.725524 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca4b7091-db1d-4b92-9009-ccafd842d405" path="/var/lib/kubelet/pods/ca4b7091-db1d-4b92-9009-ccafd842d405/volumes" Sep 30 17:46:12 crc kubenswrapper[4778]: I0930 17:46:12.033666 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7pc2n"] Sep 30 17:46:12 crc kubenswrapper[4778]: I0930 17:46:12.045323 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7pc2n"] Sep 30 17:46:13 crc kubenswrapper[4778]: I0930 17:46:13.733266 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6" path="/var/lib/kubelet/pods/4ef25b93-966b-4fc0-85e9-9d6c1cbfb9b6/volumes" Sep 30 17:46:20 crc kubenswrapper[4778]: I0930 17:46:20.715308 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:46:20 crc kubenswrapper[4778]: E0930 17:46:20.716330 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:46:32 crc kubenswrapper[4778]: I0930 17:46:32.714018 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:46:32 crc kubenswrapper[4778]: E0930 17:46:32.715022 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:46:45 crc kubenswrapper[4778]: I0930 17:46:45.715584 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:46:45 crc kubenswrapper[4778]: E0930 17:46:45.716887 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:46:47 crc kubenswrapper[4778]: I0930 17:46:47.291331 4778 scope.go:117] "RemoveContainer" containerID="419f04cd88fee8958a7bee7106a0501124468df502b036ffa77804cd1497f03a" Sep 30 17:46:47 crc kubenswrapper[4778]: I0930 17:46:47.356261 4778 scope.go:117] "RemoveContainer" containerID="50699ae44e61c9c97eb836537114a703db357385e7a871995a01d573e5661fa9" Sep 30 17:46:47 crc kubenswrapper[4778]: I0930 17:46:47.398416 4778 scope.go:117] "RemoveContainer" containerID="9d0e17aa62562f2605b6e9bc77b22cf7df4657dffc7b25b0b126e77378898822" Sep 30 17:46:57 crc kubenswrapper[4778]: I0930 17:46:57.066575 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-mt5l7"] Sep 30 17:46:57 crc kubenswrapper[4778]: I0930 17:46:57.080647 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-mt5l7"] Sep 30 17:46:57 crc kubenswrapper[4778]: I0930 17:46:57.729276 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7268e8e0-16cf-4b12-8593-e579df617a0e" path="/var/lib/kubelet/pods/7268e8e0-16cf-4b12-8593-e579df617a0e/volumes" Sep 30 17:46:59 crc kubenswrapper[4778]: I0930 17:46:59.714711 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:46:59 crc kubenswrapper[4778]: E0930 17:46:59.715222 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:47:13 crc kubenswrapper[4778]: I0930 17:47:13.714483 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:47:13 crc kubenswrapper[4778]: E0930 17:47:13.715110 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:47:27 crc kubenswrapper[4778]: I0930 17:47:27.714763 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:47:27 crc 
kubenswrapper[4778]: E0930 17:47:27.715770 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:47:39 crc kubenswrapper[4778]: I0930 17:47:39.714109 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:47:39 crc kubenswrapper[4778]: E0930 17:47:39.714945 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:47:47 crc kubenswrapper[4778]: I0930 17:47:47.539738 4778 scope.go:117] "RemoveContainer" containerID="f7c488c44759fad25c0dc4cf6e61440cc324771c957d9535029796d90cfbe524" Sep 30 17:47:53 crc kubenswrapper[4778]: I0930 17:47:53.715043 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:47:53 crc kubenswrapper[4778]: E0930 17:47:53.715971 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:48:04 crc kubenswrapper[4778]: I0930 17:48:04.714232 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:48:04 crc kubenswrapper[4778]: E0930 17:48:04.715543 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:48:17 crc kubenswrapper[4778]: I0930 17:48:17.714149 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:48:17 crc kubenswrapper[4778]: E0930 17:48:17.714886 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:48:32 crc kubenswrapper[4778]: I0930 17:48:32.713853 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:48:32 crc 
kubenswrapper[4778]: E0930 17:48:32.714934 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:48:43 crc kubenswrapper[4778]: I0930 17:48:43.714600 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:48:43 crc kubenswrapper[4778]: E0930 17:48:43.715755 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:48:58 crc kubenswrapper[4778]: I0930 17:48:58.714828 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:48:58 crc kubenswrapper[4778]: E0930 17:48:58.715843 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:49:11 crc kubenswrapper[4778]: I0930 17:49:11.720219 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:49:11 crc kubenswrapper[4778]: E0930 17:49:11.721104 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:49:24 crc kubenswrapper[4778]: I0930 17:49:24.714844 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:49:24 crc kubenswrapper[4778]: E0930 17:49:24.716204 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:49:39 crc kubenswrapper[4778]: I0930 17:49:39.714610 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:49:39 crc kubenswrapper[4778]: E0930 17:49:39.715668 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:49:51 crc kubenswrapper[4778]: I0930 17:49:51.723286 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:49:51 crc kubenswrapper[4778]: E0930 17:49:51.724395 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:50:03 crc kubenswrapper[4778]: I0930 17:50:03.714302 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:50:03 crc kubenswrapper[4778]: E0930 17:50:03.715540 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:50:15 crc kubenswrapper[4778]: I0930 17:50:15.714118 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:50:15 crc kubenswrapper[4778]: E0930 17:50:15.714986 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:50:30 crc kubenswrapper[4778]: I0930 17:50:30.714257 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:50:30 crc kubenswrapper[4778]: E0930 17:50:30.715413 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:50:41 crc kubenswrapper[4778]: I0930 17:50:41.720341 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:50:41 crc kubenswrapper[4778]: E0930 17:50:41.722041 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:50:53 crc kubenswrapper[4778]: I0930 17:50:53.714111 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:50:54 crc kubenswrapper[4778]: I0930 17:50:54.668262 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerStarted","Data":"363873babd277c029b7d9966d143bc81c18d516f3abbaa8796da4f45c82bd4ee"} Sep 30 17:52:15 crc kubenswrapper[4778]: I0930 17:52:15.293869 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sw4rm"] Sep 30 17:52:15 crc kubenswrapper[4778]: E0930 17:52:15.294985 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803cc6c3-b46f-4dbe-be84-659b4de02e9a" containerName="registry-server" Sep 30 17:52:15 crc kubenswrapper[4778]: I0930 17:52:15.295001 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="803cc6c3-b46f-4dbe-be84-659b4de02e9a" containerName="registry-server" Sep 30 17:52:15 crc kubenswrapper[4778]: E0930 17:52:15.295014 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803cc6c3-b46f-4dbe-be84-659b4de02e9a" containerName="extract-utilities" Sep 30 17:52:15 crc kubenswrapper[4778]: I0930 17:52:15.295023 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="803cc6c3-b46f-4dbe-be84-659b4de02e9a" containerName="extract-utilities" Sep 30 17:52:15 crc kubenswrapper[4778]: E0930 17:52:15.295064 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803cc6c3-b46f-4dbe-be84-659b4de02e9a" containerName="extract-content" Sep 30 17:52:15 crc kubenswrapper[4778]: I0930 17:52:15.295075 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="803cc6c3-b46f-4dbe-be84-659b4de02e9a" containerName="extract-content" Sep 30 17:52:15 crc kubenswrapper[4778]: I0930 17:52:15.295289 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="803cc6c3-b46f-4dbe-be84-659b4de02e9a" containerName="registry-server" Sep 30 17:52:15 crc kubenswrapper[4778]: I0930 17:52:15.296872 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw4rm" Sep 30 17:52:15 crc kubenswrapper[4778]: I0930 17:52:15.317913 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw4rm"] Sep 30 17:52:15 crc kubenswrapper[4778]: I0930 17:52:15.441462 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c125554c-05a7-476a-8f30-aa8dc6487417-utilities\") pod \"redhat-marketplace-sw4rm\" (UID: \"c125554c-05a7-476a-8f30-aa8dc6487417\") " pod="openshift-marketplace/redhat-marketplace-sw4rm" Sep 30 17:52:15 crc kubenswrapper[4778]: I0930 17:52:15.441515 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c125554c-05a7-476a-8f30-aa8dc6487417-catalog-content\") pod \"redhat-marketplace-sw4rm\" (UID: \"c125554c-05a7-476a-8f30-aa8dc6487417\") " pod="openshift-marketplace/redhat-marketplace-sw4rm" Sep 30 17:52:15 crc kubenswrapper[4778]: I0930 17:52:15.441549 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bn4w\" (UniqueName: \"kubernetes.io/projected/c125554c-05a7-476a-8f30-aa8dc6487417-kube-api-access-5bn4w\") pod \"redhat-marketplace-sw4rm\" (UID: \"c125554c-05a7-476a-8f30-aa8dc6487417\") " pod="openshift-marketplace/redhat-marketplace-sw4rm" Sep 30 17:52:15 crc kubenswrapper[4778]: I0930 17:52:15.543019 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c125554c-05a7-476a-8f30-aa8dc6487417-utilities\") pod \"redhat-marketplace-sw4rm\" (UID: \"c125554c-05a7-476a-8f30-aa8dc6487417\") " pod="openshift-marketplace/redhat-marketplace-sw4rm" Sep 30 17:52:15 crc kubenswrapper[4778]: I0930 17:52:15.543073 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c125554c-05a7-476a-8f30-aa8dc6487417-catalog-content\") pod \"redhat-marketplace-sw4rm\" (UID: \"c125554c-05a7-476a-8f30-aa8dc6487417\") " pod="openshift-marketplace/redhat-marketplace-sw4rm" Sep 30 17:52:15 crc kubenswrapper[4778]: I0930 17:52:15.543120 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bn4w\" (UniqueName: \"kubernetes.io/projected/c125554c-05a7-476a-8f30-aa8dc6487417-kube-api-access-5bn4w\") pod \"redhat-marketplace-sw4rm\" (UID: \"c125554c-05a7-476a-8f30-aa8dc6487417\") " pod="openshift-marketplace/redhat-marketplace-sw4rm" Sep 30 17:52:15 crc kubenswrapper[4778]: I0930 17:52:15.543790 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c125554c-05a7-476a-8f30-aa8dc6487417-utilities\") pod \"redhat-marketplace-sw4rm\" (UID: \"c125554c-05a7-476a-8f30-aa8dc6487417\") " pod="openshift-marketplace/redhat-marketplace-sw4rm" Sep 30 17:52:15 crc kubenswrapper[4778]: I0930 17:52:15.544019 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c125554c-05a7-476a-8f30-aa8dc6487417-catalog-content\") pod \"redhat-marketplace-sw4rm\" (UID: \"c125554c-05a7-476a-8f30-aa8dc6487417\") " pod="openshift-marketplace/redhat-marketplace-sw4rm" Sep 30 17:52:15 crc kubenswrapper[4778]: I0930 17:52:15.576805 4778 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5bn4w\" (UniqueName: \"kubernetes.io/projected/c125554c-05a7-476a-8f30-aa8dc6487417-kube-api-access-5bn4w\") pod \"redhat-marketplace-sw4rm\" (UID: \"c125554c-05a7-476a-8f30-aa8dc6487417\") " pod="openshift-marketplace/redhat-marketplace-sw4rm" Sep 30 17:52:15 crc kubenswrapper[4778]: I0930 17:52:15.656319 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw4rm" Sep 30 17:52:16 crc kubenswrapper[4778]: I0930 17:52:16.139387 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw4rm"] Sep 30 17:52:16 crc kubenswrapper[4778]: I0930 17:52:16.435742 4778 generic.go:334] "Generic (PLEG): container finished" podID="c125554c-05a7-476a-8f30-aa8dc6487417" containerID="75ec55f2ffec027076102ef4e145917ad65b8348857232bc57b22d7a4fec53bd" exitCode=0 Sep 30 17:52:16 crc kubenswrapper[4778]: I0930 17:52:16.435800 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw4rm" event={"ID":"c125554c-05a7-476a-8f30-aa8dc6487417","Type":"ContainerDied","Data":"75ec55f2ffec027076102ef4e145917ad65b8348857232bc57b22d7a4fec53bd"} Sep 30 17:52:16 crc kubenswrapper[4778]: I0930 17:52:16.435835 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw4rm" event={"ID":"c125554c-05a7-476a-8f30-aa8dc6487417","Type":"ContainerStarted","Data":"f9dc9c257e595de4ae71f9802136e5f59af5d61ad74ecac4fd7fedb4faa52c7e"} Sep 30 17:52:16 crc kubenswrapper[4778]: I0930 17:52:16.439875 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 17:52:18 crc kubenswrapper[4778]: I0930 17:52:18.458817 4778 generic.go:334] "Generic (PLEG): container finished" podID="c125554c-05a7-476a-8f30-aa8dc6487417" containerID="7c944d4e047d760045bdf3496fc17ca8c6650ca5684df10cc244fddff6af9453" exitCode=0 Sep 30 17:52:18 crc kubenswrapper[4778]: I0930 17:52:18.459495 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw4rm" event={"ID":"c125554c-05a7-476a-8f30-aa8dc6487417","Type":"ContainerDied","Data":"7c944d4e047d760045bdf3496fc17ca8c6650ca5684df10cc244fddff6af9453"} Sep 30 17:52:19 crc kubenswrapper[4778]: I0930 17:52:19.471510 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw4rm" event={"ID":"c125554c-05a7-476a-8f30-aa8dc6487417","Type":"ContainerStarted","Data":"f5e06cd0315f584de88f3d6accdfea1a746dde690c1ea30770425cbc5d11cf82"} Sep 30 17:52:19 crc kubenswrapper[4778]: I0930 17:52:19.496899 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sw4rm" podStartSLOduration=1.9099073469999999 podStartE2EDuration="4.496880129s" podCreationTimestamp="2025-09-30 17:52:15 +0000 UTC" firstStartedPulling="2025-09-30 17:52:16.438047645 +0000 UTC m=+2075.427945488" lastFinishedPulling="2025-09-30 17:52:19.025020457 +0000 UTC m=+2078.014918270" observedRunningTime="2025-09-30 17:52:19.495077493 +0000 UTC m=+2078.484975336" watchObservedRunningTime="2025-09-30 17:52:19.496880129 +0000 UTC m=+2078.486777942" Sep 30 17:52:25 crc kubenswrapper[4778]: I0930 17:52:25.656821 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sw4rm" Sep 30 17:52:25 crc kubenswrapper[4778]: I0930 17:52:25.658774 4778 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sw4rm" Sep 30 17:52:25 crc kubenswrapper[4778]: I0930 17:52:25.710135 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sw4rm" Sep 30 17:52:26 crc kubenswrapper[4778]: I0930 17:52:26.623240 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sw4rm" Sep 30 17:52:26 crc kubenswrapper[4778]: I0930 17:52:26.696609 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw4rm"] Sep 30 17:52:28 crc kubenswrapper[4778]: I0930 17:52:28.564281 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sw4rm" podUID="c125554c-05a7-476a-8f30-aa8dc6487417" containerName="registry-server" containerID="cri-o://f5e06cd0315f584de88f3d6accdfea1a746dde690c1ea30770425cbc5d11cf82" gracePeriod=2 Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.094721 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw4rm" Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.110410 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bn4w\" (UniqueName: \"kubernetes.io/projected/c125554c-05a7-476a-8f30-aa8dc6487417-kube-api-access-5bn4w\") pod \"c125554c-05a7-476a-8f30-aa8dc6487417\" (UID: \"c125554c-05a7-476a-8f30-aa8dc6487417\") " Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.110505 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c125554c-05a7-476a-8f30-aa8dc6487417-utilities\") pod \"c125554c-05a7-476a-8f30-aa8dc6487417\" (UID: \"c125554c-05a7-476a-8f30-aa8dc6487417\") " Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.110585 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c125554c-05a7-476a-8f30-aa8dc6487417-catalog-content\") pod \"c125554c-05a7-476a-8f30-aa8dc6487417\" (UID: \"c125554c-05a7-476a-8f30-aa8dc6487417\") " Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.111653 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c125554c-05a7-476a-8f30-aa8dc6487417-utilities" (OuterVolumeSpecName: "utilities") pod "c125554c-05a7-476a-8f30-aa8dc6487417" (UID: "c125554c-05a7-476a-8f30-aa8dc6487417"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.117679 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c125554c-05a7-476a-8f30-aa8dc6487417-kube-api-access-5bn4w" (OuterVolumeSpecName: "kube-api-access-5bn4w") pod "c125554c-05a7-476a-8f30-aa8dc6487417" (UID: "c125554c-05a7-476a-8f30-aa8dc6487417"). InnerVolumeSpecName "kube-api-access-5bn4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.135308 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c125554c-05a7-476a-8f30-aa8dc6487417-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c125554c-05a7-476a-8f30-aa8dc6487417" (UID: "c125554c-05a7-476a-8f30-aa8dc6487417"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.212879 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c125554c-05a7-476a-8f30-aa8dc6487417-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.212933 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bn4w\" (UniqueName: \"kubernetes.io/projected/c125554c-05a7-476a-8f30-aa8dc6487417-kube-api-access-5bn4w\") on node \"crc\" DevicePath \"\"" Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.212950 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c125554c-05a7-476a-8f30-aa8dc6487417-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.573762 4778 generic.go:334] "Generic (PLEG): container finished" podID="c125554c-05a7-476a-8f30-aa8dc6487417" containerID="f5e06cd0315f584de88f3d6accdfea1a746dde690c1ea30770425cbc5d11cf82" exitCode=0 Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.573806 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw4rm" event={"ID":"c125554c-05a7-476a-8f30-aa8dc6487417","Type":"ContainerDied","Data":"f5e06cd0315f584de88f3d6accdfea1a746dde690c1ea30770425cbc5d11cf82"} Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.573831 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sw4rm" event={"ID":"c125554c-05a7-476a-8f30-aa8dc6487417","Type":"ContainerDied","Data":"f9dc9c257e595de4ae71f9802136e5f59af5d61ad74ecac4fd7fedb4faa52c7e"} Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.573849 4778 scope.go:117] "RemoveContainer" containerID="f5e06cd0315f584de88f3d6accdfea1a746dde690c1ea30770425cbc5d11cf82" Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.573958 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sw4rm" Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.607013 4778 scope.go:117] "RemoveContainer" containerID="7c944d4e047d760045bdf3496fc17ca8c6650ca5684df10cc244fddff6af9453" Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.634952 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw4rm"] Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.651330 4778 scope.go:117] "RemoveContainer" containerID="75ec55f2ffec027076102ef4e145917ad65b8348857232bc57b22d7a4fec53bd" Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.659386 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sw4rm"] Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.692292 4778 scope.go:117] "RemoveContainer" containerID="f5e06cd0315f584de88f3d6accdfea1a746dde690c1ea30770425cbc5d11cf82" Sep 30 17:52:29 crc kubenswrapper[4778]: E0930 17:52:29.692952 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5e06cd0315f584de88f3d6accdfea1a746dde690c1ea30770425cbc5d11cf82\": container with ID starting with f5e06cd0315f584de88f3d6accdfea1a746dde690c1ea30770425cbc5d11cf82 not found: ID does not exist" containerID="f5e06cd0315f584de88f3d6accdfea1a746dde690c1ea30770425cbc5d11cf82" Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.692988 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5e06cd0315f584de88f3d6accdfea1a746dde690c1ea30770425cbc5d11cf82"} err="failed to get container status \"f5e06cd0315f584de88f3d6accdfea1a746dde690c1ea30770425cbc5d11cf82\": rpc error: code = NotFound desc = could not find container \"f5e06cd0315f584de88f3d6accdfea1a746dde690c1ea30770425cbc5d11cf82\": container with ID starting with f5e06cd0315f584de88f3d6accdfea1a746dde690c1ea30770425cbc5d11cf82 not found: ID does not exist" Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.693015 4778 scope.go:117] "RemoveContainer" containerID="7c944d4e047d760045bdf3496fc17ca8c6650ca5684df10cc244fddff6af9453" Sep 30 17:52:29 crc kubenswrapper[4778]: E0930 17:52:29.693220 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c944d4e047d760045bdf3496fc17ca8c6650ca5684df10cc244fddff6af9453\": container with ID starting with 7c944d4e047d760045bdf3496fc17ca8c6650ca5684df10cc244fddff6af9453 not found: ID does not exist" containerID="7c944d4e047d760045bdf3496fc17ca8c6650ca5684df10cc244fddff6af9453" Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.693241 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c944d4e047d760045bdf3496fc17ca8c6650ca5684df10cc244fddff6af9453"} err="failed to get container status \"7c944d4e047d760045bdf3496fc17ca8c6650ca5684df10cc244fddff6af9453\": rpc error: code = NotFound desc = could not find container \"7c944d4e047d760045bdf3496fc17ca8c6650ca5684df10cc244fddff6af9453\": container with ID starting with 7c944d4e047d760045bdf3496fc17ca8c6650ca5684df10cc244fddff6af9453 not found: ID does not exist" Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.693257 4778 scope.go:117] "RemoveContainer" containerID="75ec55f2ffec027076102ef4e145917ad65b8348857232bc57b22d7a4fec53bd" Sep 30 17:52:29 crc kubenswrapper[4778]: E0930 17:52:29.693555 4778 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"75ec55f2ffec027076102ef4e145917ad65b8348857232bc57b22d7a4fec53bd\": container with ID starting with 75ec55f2ffec027076102ef4e145917ad65b8348857232bc57b22d7a4fec53bd not found: ID does not exist" containerID="75ec55f2ffec027076102ef4e145917ad65b8348857232bc57b22d7a4fec53bd" Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.693585 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ec55f2ffec027076102ef4e145917ad65b8348857232bc57b22d7a4fec53bd"} err="failed to get container status \"75ec55f2ffec027076102ef4e145917ad65b8348857232bc57b22d7a4fec53bd\": rpc error: code = NotFound desc = could not find container \"75ec55f2ffec027076102ef4e145917ad65b8348857232bc57b22d7a4fec53bd\": container with ID starting with 75ec55f2ffec027076102ef4e145917ad65b8348857232bc57b22d7a4fec53bd not found: ID does not exist" Sep 30 17:52:29 crc kubenswrapper[4778]: I0930 17:52:29.723994 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c125554c-05a7-476a-8f30-aa8dc6487417" path="/var/lib/kubelet/pods/c125554c-05a7-476a-8f30-aa8dc6487417/volumes" Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.064221 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5sqbn"] Sep 30 17:52:34 crc kubenswrapper[4778]: E0930 17:52:34.064961 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c125554c-05a7-476a-8f30-aa8dc6487417" containerName="extract-utilities" Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.064974 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c125554c-05a7-476a-8f30-aa8dc6487417" containerName="extract-utilities" Sep 30 17:52:34 crc kubenswrapper[4778]: E0930 17:52:34.064987 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c125554c-05a7-476a-8f30-aa8dc6487417" containerName="extract-content" Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.064992 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c125554c-05a7-476a-8f30-aa8dc6487417" containerName="extract-content" Sep 30 17:52:34 crc kubenswrapper[4778]: E0930 17:52:34.065032 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c125554c-05a7-476a-8f30-aa8dc6487417" containerName="registry-server" Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.065038 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="c125554c-05a7-476a-8f30-aa8dc6487417" containerName="registry-server" Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.065223 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="c125554c-05a7-476a-8f30-aa8dc6487417" containerName="registry-server" Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.066771 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5sqbn" Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.090760 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5sqbn"] Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.127078 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f463b6-a6a5-470c-9413-207998f8e2bb-utilities\") pod \"community-operators-5sqbn\" (UID: \"f6f463b6-a6a5-470c-9413-207998f8e2bb\") " pod="openshift-marketplace/community-operators-5sqbn" Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.127271 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f463b6-a6a5-470c-9413-207998f8e2bb-catalog-content\") pod \"community-operators-5sqbn\" (UID: \"f6f463b6-a6a5-470c-9413-207998f8e2bb\") " pod="openshift-marketplace/community-operators-5sqbn" Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.127364 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psppn\" (UniqueName: \"kubernetes.io/projected/f6f463b6-a6a5-470c-9413-207998f8e2bb-kube-api-access-psppn\") pod \"community-operators-5sqbn\" (UID: \"f6f463b6-a6a5-470c-9413-207998f8e2bb\") " pod="openshift-marketplace/community-operators-5sqbn" Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.228708 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f463b6-a6a5-470c-9413-207998f8e2bb-utilities\") pod \"community-operators-5sqbn\" (UID: \"f6f463b6-a6a5-470c-9413-207998f8e2bb\") " pod="openshift-marketplace/community-operators-5sqbn" Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.228869 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f463b6-a6a5-470c-9413-207998f8e2bb-catalog-content\") pod \"community-operators-5sqbn\" (UID: \"f6f463b6-a6a5-470c-9413-207998f8e2bb\") " pod="openshift-marketplace/community-operators-5sqbn" Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.228955 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psppn\" (UniqueName: \"kubernetes.io/projected/f6f463b6-a6a5-470c-9413-207998f8e2bb-kube-api-access-psppn\") pod \"community-operators-5sqbn\" (UID: \"f6f463b6-a6a5-470c-9413-207998f8e2bb\") " pod="openshift-marketplace/community-operators-5sqbn" Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.229324 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f463b6-a6a5-470c-9413-207998f8e2bb-utilities\") pod \"community-operators-5sqbn\" (UID: \"f6f463b6-a6a5-470c-9413-207998f8e2bb\") " pod="openshift-marketplace/community-operators-5sqbn" Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.229344 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f463b6-a6a5-470c-9413-207998f8e2bb-catalog-content\") pod \"community-operators-5sqbn\" (UID: \"f6f463b6-a6a5-470c-9413-207998f8e2bb\") " pod="openshift-marketplace/community-operators-5sqbn" Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.249908 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-psppn\" (UniqueName: \"kubernetes.io/projected/f6f463b6-a6a5-470c-9413-207998f8e2bb-kube-api-access-psppn\") pod \"community-operators-5sqbn\" (UID: \"f6f463b6-a6a5-470c-9413-207998f8e2bb\") " pod="openshift-marketplace/community-operators-5sqbn" Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.410225 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5sqbn" Sep 30 17:52:34 crc kubenswrapper[4778]: I0930 17:52:34.704704 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5sqbn"] Sep 30 17:52:35 crc kubenswrapper[4778]: I0930 17:52:35.635888 4778 generic.go:334] "Generic (PLEG): container finished" podID="f6f463b6-a6a5-470c-9413-207998f8e2bb" containerID="806ca626ffce7dd81866cba8a2b0f75e14d609f130d3bdb072a8cb9c12108471" exitCode=0 Sep 30 17:52:35 crc kubenswrapper[4778]: I0930 17:52:35.636008 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sqbn" event={"ID":"f6f463b6-a6a5-470c-9413-207998f8e2bb","Type":"ContainerDied","Data":"806ca626ffce7dd81866cba8a2b0f75e14d609f130d3bdb072a8cb9c12108471"} Sep 30 17:52:35 crc kubenswrapper[4778]: I0930 17:52:35.636441 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sqbn" event={"ID":"f6f463b6-a6a5-470c-9413-207998f8e2bb","Type":"ContainerStarted","Data":"0e43ff0abd2f9c13555f4671671fcbbb0382b9a3685a4ace8f841cdbfcb0d19f"} Sep 30 17:52:37 crc kubenswrapper[4778]: I0930 17:52:37.661584 4778 generic.go:334] "Generic (PLEG): container finished" podID="f6f463b6-a6a5-470c-9413-207998f8e2bb" containerID="1cc31130edb76b5c030aa21b3e032c176d355aa6efcc2561ce95744c50758500" exitCode=0 Sep 30 17:52:37 crc kubenswrapper[4778]: I0930 17:52:37.661725 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sqbn" event={"ID":"f6f463b6-a6a5-470c-9413-207998f8e2bb","Type":"ContainerDied","Data":"1cc31130edb76b5c030aa21b3e032c176d355aa6efcc2561ce95744c50758500"} Sep 30 17:52:38 crc kubenswrapper[4778]: I0930 17:52:38.674312 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sqbn" event={"ID":"f6f463b6-a6a5-470c-9413-207998f8e2bb","Type":"ContainerStarted","Data":"f2d576f35ccf73dc9218407f1fad017e83ce9da8496977d298ba3772ab5d706e"} Sep 30 17:52:38 crc kubenswrapper[4778]: I0930 17:52:38.700694 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5sqbn" podStartSLOduration=2.225006475 podStartE2EDuration="4.70066837s" podCreationTimestamp="2025-09-30 17:52:34 +0000 UTC" firstStartedPulling="2025-09-30 17:52:35.638402948 +0000 UTC m=+2094.628300771" lastFinishedPulling="2025-09-30 17:52:38.114064853 +0000 UTC m=+2097.103962666" observedRunningTime="2025-09-30 17:52:38.692845854 +0000 UTC m=+2097.682743697" watchObservedRunningTime="2025-09-30 17:52:38.70066837 +0000 UTC m=+2097.690566183" Sep 30 17:52:44 crc kubenswrapper[4778]: I0930 17:52:44.410837 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5sqbn" Sep 30 17:52:44 crc kubenswrapper[4778]: I0930 17:52:44.411587 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5sqbn" Sep 30 17:52:44 crc kubenswrapper[4778]: I0930 17:52:44.482967 4778 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5sqbn" Sep 30 17:52:44 crc kubenswrapper[4778]: I0930 17:52:44.823248 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5sqbn" Sep 30 17:52:44 crc kubenswrapper[4778]: I0930 17:52:44.886166 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5sqbn"] Sep 30 17:52:46 crc kubenswrapper[4778]: I0930 17:52:46.755479 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5sqbn" podUID="f6f463b6-a6a5-470c-9413-207998f8e2bb" containerName="registry-server" containerID="cri-o://f2d576f35ccf73dc9218407f1fad017e83ce9da8496977d298ba3772ab5d706e" gracePeriod=2 Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.272534 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5sqbn" Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.403792 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f463b6-a6a5-470c-9413-207998f8e2bb-catalog-content\") pod \"f6f463b6-a6a5-470c-9413-207998f8e2bb\" (UID: \"f6f463b6-a6a5-470c-9413-207998f8e2bb\") " Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.403887 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-psppn\" (UniqueName: \"kubernetes.io/projected/f6f463b6-a6a5-470c-9413-207998f8e2bb-kube-api-access-psppn\") pod \"f6f463b6-a6a5-470c-9413-207998f8e2bb\" (UID: \"f6f463b6-a6a5-470c-9413-207998f8e2bb\") " Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.404086 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f463b6-a6a5-470c-9413-207998f8e2bb-utilities\") pod \"f6f463b6-a6a5-470c-9413-207998f8e2bb\" (UID: \"f6f463b6-a6a5-470c-9413-207998f8e2bb\") " Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.404871 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6f463b6-a6a5-470c-9413-207998f8e2bb-utilities" (OuterVolumeSpecName: "utilities") pod "f6f463b6-a6a5-470c-9413-207998f8e2bb" (UID: "f6f463b6-a6a5-470c-9413-207998f8e2bb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.413268 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f463b6-a6a5-470c-9413-207998f8e2bb-kube-api-access-psppn" (OuterVolumeSpecName: "kube-api-access-psppn") pod "f6f463b6-a6a5-470c-9413-207998f8e2bb" (UID: "f6f463b6-a6a5-470c-9413-207998f8e2bb"). InnerVolumeSpecName "kube-api-access-psppn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.483119 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6f463b6-a6a5-470c-9413-207998f8e2bb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6f463b6-a6a5-470c-9413-207998f8e2bb" (UID: "f6f463b6-a6a5-470c-9413-207998f8e2bb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.506373 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6f463b6-a6a5-470c-9413-207998f8e2bb-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.506399 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6f463b6-a6a5-470c-9413-207998f8e2bb-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.506412 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-psppn\" (UniqueName: \"kubernetes.io/projected/f6f463b6-a6a5-470c-9413-207998f8e2bb-kube-api-access-psppn\") on node \"crc\" DevicePath \"\"" Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.771211 4778 generic.go:334] "Generic (PLEG): container finished" podID="f6f463b6-a6a5-470c-9413-207998f8e2bb" containerID="f2d576f35ccf73dc9218407f1fad017e83ce9da8496977d298ba3772ab5d706e" exitCode=0 Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.771329 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sqbn" event={"ID":"f6f463b6-a6a5-470c-9413-207998f8e2bb","Type":"ContainerDied","Data":"f2d576f35ccf73dc9218407f1fad017e83ce9da8496977d298ba3772ab5d706e"} Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.771322 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5sqbn" Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.772498 4778 scope.go:117] "RemoveContainer" containerID="f2d576f35ccf73dc9218407f1fad017e83ce9da8496977d298ba3772ab5d706e" Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.772478 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5sqbn" event={"ID":"f6f463b6-a6a5-470c-9413-207998f8e2bb","Type":"ContainerDied","Data":"0e43ff0abd2f9c13555f4671671fcbbb0382b9a3685a4ace8f841cdbfcb0d19f"} Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.799718 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5sqbn"] Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.806227 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5sqbn"] Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.815038 4778 scope.go:117] "RemoveContainer" containerID="1cc31130edb76b5c030aa21b3e032c176d355aa6efcc2561ce95744c50758500" Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.844036 4778 scope.go:117] "RemoveContainer" containerID="806ca626ffce7dd81866cba8a2b0f75e14d609f130d3bdb072a8cb9c12108471" Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.890489 4778 scope.go:117] "RemoveContainer" containerID="f2d576f35ccf73dc9218407f1fad017e83ce9da8496977d298ba3772ab5d706e" Sep 30 17:52:47 crc kubenswrapper[4778]: E0930 17:52:47.890945 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2d576f35ccf73dc9218407f1fad017e83ce9da8496977d298ba3772ab5d706e\": container with ID starting with f2d576f35ccf73dc9218407f1fad017e83ce9da8496977d298ba3772ab5d706e not found: ID does not exist" containerID="f2d576f35ccf73dc9218407f1fad017e83ce9da8496977d298ba3772ab5d706e" Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.890986 
4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2d576f35ccf73dc9218407f1fad017e83ce9da8496977d298ba3772ab5d706e"} err="failed to get container status \"f2d576f35ccf73dc9218407f1fad017e83ce9da8496977d298ba3772ab5d706e\": rpc error: code = NotFound desc = could not find container \"f2d576f35ccf73dc9218407f1fad017e83ce9da8496977d298ba3772ab5d706e\": container with ID starting with f2d576f35ccf73dc9218407f1fad017e83ce9da8496977d298ba3772ab5d706e not found: ID does not exist" Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.891012 4778 scope.go:117] "RemoveContainer" containerID="1cc31130edb76b5c030aa21b3e032c176d355aa6efcc2561ce95744c50758500" Sep 30 17:52:47 crc kubenswrapper[4778]: E0930 17:52:47.891416 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cc31130edb76b5c030aa21b3e032c176d355aa6efcc2561ce95744c50758500\": container with ID starting with 1cc31130edb76b5c030aa21b3e032c176d355aa6efcc2561ce95744c50758500 not found: ID does not exist" containerID="1cc31130edb76b5c030aa21b3e032c176d355aa6efcc2561ce95744c50758500" Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.891463 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cc31130edb76b5c030aa21b3e032c176d355aa6efcc2561ce95744c50758500"} err="failed to get container status \"1cc31130edb76b5c030aa21b3e032c176d355aa6efcc2561ce95744c50758500\": rpc error: code = NotFound desc = could not find container \"1cc31130edb76b5c030aa21b3e032c176d355aa6efcc2561ce95744c50758500\": container with ID starting with 1cc31130edb76b5c030aa21b3e032c176d355aa6efcc2561ce95744c50758500 not found: ID does not exist" Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.891496 4778 scope.go:117] "RemoveContainer" containerID="806ca626ffce7dd81866cba8a2b0f75e14d609f130d3bdb072a8cb9c12108471" Sep 30 17:52:47 crc kubenswrapper[4778]: E0930 17:52:47.891920 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"806ca626ffce7dd81866cba8a2b0f75e14d609f130d3bdb072a8cb9c12108471\": container with ID starting with 806ca626ffce7dd81866cba8a2b0f75e14d609f130d3bdb072a8cb9c12108471 not found: ID does not exist" containerID="806ca626ffce7dd81866cba8a2b0f75e14d609f130d3bdb072a8cb9c12108471" Sep 30 17:52:47 crc kubenswrapper[4778]: I0930 17:52:47.891940 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"806ca626ffce7dd81866cba8a2b0f75e14d609f130d3bdb072a8cb9c12108471"} err="failed to get container status \"806ca626ffce7dd81866cba8a2b0f75e14d609f130d3bdb072a8cb9c12108471\": rpc error: code = NotFound desc = could not find container \"806ca626ffce7dd81866cba8a2b0f75e14d609f130d3bdb072a8cb9c12108471\": container with ID starting with 806ca626ffce7dd81866cba8a2b0f75e14d609f130d3bdb072a8cb9c12108471 not found: ID does not exist" Sep 30 17:52:49 crc kubenswrapper[4778]: I0930 17:52:49.732175 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f463b6-a6a5-470c-9413-207998f8e2bb" path="/var/lib/kubelet/pods/f6f463b6-a6a5-470c-9413-207998f8e2bb/volumes" Sep 30 17:53:14 crc kubenswrapper[4778]: I0930 17:53:14.813488 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:53:14 crc kubenswrapper[4778]: I0930 17:53:14.814765 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:53:44 crc kubenswrapper[4778]: I0930 17:53:44.811563 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:53:44 crc kubenswrapper[4778]: I0930 17:53:44.812797 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.015240 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v65hr"] Sep 30 17:54:10 crc kubenswrapper[4778]: E0930 17:54:10.017089 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f463b6-a6a5-470c-9413-207998f8e2bb" containerName="extract-utilities" Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.017105 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f463b6-a6a5-470c-9413-207998f8e2bb" containerName="extract-utilities" Sep 30 17:54:10 crc kubenswrapper[4778]: E0930 17:54:10.017133 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f463b6-a6a5-470c-9413-207998f8e2bb" containerName="extract-content" Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.017139 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f463b6-a6a5-470c-9413-207998f8e2bb" containerName="extract-content" Sep 30 17:54:10 crc kubenswrapper[4778]: E0930 17:54:10.017150 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f463b6-a6a5-470c-9413-207998f8e2bb" containerName="registry-server" Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.017156 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f463b6-a6a5-470c-9413-207998f8e2bb" containerName="registry-server" Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.017339 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f463b6-a6a5-470c-9413-207998f8e2bb" containerName="registry-server" Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.018557 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v65hr" Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.024944 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v65hr"] Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.123473 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/703b95cf-2960-46b7-869c-a14c1ccac1ba-utilities\") pod \"certified-operators-v65hr\" (UID: \"703b95cf-2960-46b7-869c-a14c1ccac1ba\") " pod="openshift-marketplace/certified-operators-v65hr" Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.123639 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67rjb\" (UniqueName: \"kubernetes.io/projected/703b95cf-2960-46b7-869c-a14c1ccac1ba-kube-api-access-67rjb\") pod \"certified-operators-v65hr\" (UID: \"703b95cf-2960-46b7-869c-a14c1ccac1ba\") " pod="openshift-marketplace/certified-operators-v65hr" Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.123728 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/703b95cf-2960-46b7-869c-a14c1ccac1ba-catalog-content\") pod \"certified-operators-v65hr\" (UID: \"703b95cf-2960-46b7-869c-a14c1ccac1ba\") " pod="openshift-marketplace/certified-operators-v65hr" Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.225733 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/703b95cf-2960-46b7-869c-a14c1ccac1ba-utilities\") pod \"certified-operators-v65hr\" (UID: \"703b95cf-2960-46b7-869c-a14c1ccac1ba\") " pod="openshift-marketplace/certified-operators-v65hr" Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.225833 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67rjb\" (UniqueName: \"kubernetes.io/projected/703b95cf-2960-46b7-869c-a14c1ccac1ba-kube-api-access-67rjb\") pod \"certified-operators-v65hr\" (UID: \"703b95cf-2960-46b7-869c-a14c1ccac1ba\") " pod="openshift-marketplace/certified-operators-v65hr" Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.225911 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/703b95cf-2960-46b7-869c-a14c1ccac1ba-catalog-content\") pod \"certified-operators-v65hr\" (UID: \"703b95cf-2960-46b7-869c-a14c1ccac1ba\") " pod="openshift-marketplace/certified-operators-v65hr" Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.226395 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/703b95cf-2960-46b7-869c-a14c1ccac1ba-utilities\") pod \"certified-operators-v65hr\" (UID: \"703b95cf-2960-46b7-869c-a14c1ccac1ba\") " pod="openshift-marketplace/certified-operators-v65hr" Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.226544 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/703b95cf-2960-46b7-869c-a14c1ccac1ba-catalog-content\") pod \"certified-operators-v65hr\" (UID: \"703b95cf-2960-46b7-869c-a14c1ccac1ba\") " pod="openshift-marketplace/certified-operators-v65hr" Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.254427 4778 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-67rjb\" (UniqueName: \"kubernetes.io/projected/703b95cf-2960-46b7-869c-a14c1ccac1ba-kube-api-access-67rjb\") pod \"certified-operators-v65hr\" (UID: \"703b95cf-2960-46b7-869c-a14c1ccac1ba\") " pod="openshift-marketplace/certified-operators-v65hr" Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.375268 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v65hr" Sep 30 17:54:10 crc kubenswrapper[4778]: I0930 17:54:10.818659 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v65hr"] Sep 30 17:54:11 crc kubenswrapper[4778]: I0930 17:54:11.592900 4778 generic.go:334] "Generic (PLEG): container finished" podID="703b95cf-2960-46b7-869c-a14c1ccac1ba" containerID="d2e3386bdb428ecd6f840420f9ca9ed17049ad362a9a18e38443d59e8e6535fa" exitCode=0 Sep 30 17:54:11 crc kubenswrapper[4778]: I0930 17:54:11.592991 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v65hr" event={"ID":"703b95cf-2960-46b7-869c-a14c1ccac1ba","Type":"ContainerDied","Data":"d2e3386bdb428ecd6f840420f9ca9ed17049ad362a9a18e38443d59e8e6535fa"} Sep 30 17:54:11 crc kubenswrapper[4778]: I0930 17:54:11.593020 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v65hr" event={"ID":"703b95cf-2960-46b7-869c-a14c1ccac1ba","Type":"ContainerStarted","Data":"9285ef5f00d27dfa393243fbde7b7c5ae34e79fbcc925a4d5935d952466683b2"} Sep 30 17:54:13 crc kubenswrapper[4778]: I0930 17:54:13.629147 4778 generic.go:334] "Generic (PLEG): container finished" podID="703b95cf-2960-46b7-869c-a14c1ccac1ba" containerID="5af38442485686f61eb0fed94c29e4d3019a72772ff54e820eaf82ae7f18b9bd" exitCode=0 Sep 30 17:54:13 crc kubenswrapper[4778]: I0930 17:54:13.629565 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v65hr" event={"ID":"703b95cf-2960-46b7-869c-a14c1ccac1ba","Type":"ContainerDied","Data":"5af38442485686f61eb0fed94c29e4d3019a72772ff54e820eaf82ae7f18b9bd"} Sep 30 17:54:14 crc kubenswrapper[4778]: I0930 17:54:14.646798 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v65hr" event={"ID":"703b95cf-2960-46b7-869c-a14c1ccac1ba","Type":"ContainerStarted","Data":"b0949e8058c8230a242a1eba7af93276c7b6124fb9292e2a68af4a524336bfe3"} Sep 30 17:54:14 crc kubenswrapper[4778]: I0930 17:54:14.668117 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v65hr" podStartSLOduration=3.214651632 podStartE2EDuration="5.668099604s" podCreationTimestamp="2025-09-30 17:54:09 +0000 UTC" firstStartedPulling="2025-09-30 17:54:11.596108986 +0000 UTC m=+2190.586006829" lastFinishedPulling="2025-09-30 17:54:14.049556988 +0000 UTC m=+2193.039454801" observedRunningTime="2025-09-30 17:54:14.662626944 +0000 UTC m=+2193.652524747" watchObservedRunningTime="2025-09-30 17:54:14.668099604 +0000 UTC m=+2193.657997407" Sep 30 17:54:14 crc kubenswrapper[4778]: I0930 17:54:14.811676 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 17:54:14 crc kubenswrapper[4778]: I0930 17:54:14.811746 4778 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 17:54:14 crc kubenswrapper[4778]: I0930 17:54:14.811806 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:54:14 crc kubenswrapper[4778]: I0930 17:54:14.812552 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"363873babd277c029b7d9966d143bc81c18d516f3abbaa8796da4f45c82bd4ee"} pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:54:14 crc kubenswrapper[4778]: I0930 17:54:14.812652 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" containerID="cri-o://363873babd277c029b7d9966d143bc81c18d516f3abbaa8796da4f45c82bd4ee" gracePeriod=600 Sep 30 17:54:15 crc kubenswrapper[4778]: I0930 17:54:15.663368 4778 generic.go:334] "Generic (PLEG): container finished" podID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerID="363873babd277c029b7d9966d143bc81c18d516f3abbaa8796da4f45c82bd4ee" exitCode=0 Sep 30 17:54:15 crc kubenswrapper[4778]: I0930 17:54:15.663417 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerDied","Data":"363873babd277c029b7d9966d143bc81c18d516f3abbaa8796da4f45c82bd4ee"} Sep 30 17:54:15 crc kubenswrapper[4778]: I0930 17:54:15.664127 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerStarted","Data":"5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"} Sep 30 17:54:15 crc kubenswrapper[4778]: I0930 17:54:15.664174 4778 scope.go:117] "RemoveContainer" containerID="62e9eb92677b4487effd1203da179a2db4b671c1f05f3361fdd90b581b3d2312" Sep 30 17:54:20 crc kubenswrapper[4778]: I0930 17:54:20.375492 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v65hr" Sep 30 17:54:20 crc kubenswrapper[4778]: I0930 17:54:20.375962 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v65hr" Sep 30 17:54:20 crc kubenswrapper[4778]: I0930 17:54:20.430049 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v65hr" Sep 30 17:54:20 crc kubenswrapper[4778]: I0930 17:54:20.791474 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v65hr" Sep 30 17:54:20 crc kubenswrapper[4778]: I0930 17:54:20.867719 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v65hr"] Sep 30 17:54:22 crc kubenswrapper[4778]: I0930 17:54:22.724191 4778 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-v65hr" podUID="703b95cf-2960-46b7-869c-a14c1ccac1ba" containerName="registry-server" containerID="cri-o://b0949e8058c8230a242a1eba7af93276c7b6124fb9292e2a68af4a524336bfe3" gracePeriod=2 Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.152842 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v65hr" Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.333013 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67rjb\" (UniqueName: \"kubernetes.io/projected/703b95cf-2960-46b7-869c-a14c1ccac1ba-kube-api-access-67rjb\") pod \"703b95cf-2960-46b7-869c-a14c1ccac1ba\" (UID: \"703b95cf-2960-46b7-869c-a14c1ccac1ba\") " Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.333136 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/703b95cf-2960-46b7-869c-a14c1ccac1ba-catalog-content\") pod \"703b95cf-2960-46b7-869c-a14c1ccac1ba\" (UID: \"703b95cf-2960-46b7-869c-a14c1ccac1ba\") " Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.333183 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/703b95cf-2960-46b7-869c-a14c1ccac1ba-utilities\") pod \"703b95cf-2960-46b7-869c-a14c1ccac1ba\" (UID: \"703b95cf-2960-46b7-869c-a14c1ccac1ba\") " Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.334104 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703b95cf-2960-46b7-869c-a14c1ccac1ba-utilities" (OuterVolumeSpecName: "utilities") pod "703b95cf-2960-46b7-869c-a14c1ccac1ba" (UID: "703b95cf-2960-46b7-869c-a14c1ccac1ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.334445 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/703b95cf-2960-46b7-869c-a14c1ccac1ba-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.341831 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/703b95cf-2960-46b7-869c-a14c1ccac1ba-kube-api-access-67rjb" (OuterVolumeSpecName: "kube-api-access-67rjb") pod "703b95cf-2960-46b7-869c-a14c1ccac1ba" (UID: "703b95cf-2960-46b7-869c-a14c1ccac1ba"). InnerVolumeSpecName "kube-api-access-67rjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.389754 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/703b95cf-2960-46b7-869c-a14c1ccac1ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "703b95cf-2960-46b7-869c-a14c1ccac1ba" (UID: "703b95cf-2960-46b7-869c-a14c1ccac1ba"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.435812 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67rjb\" (UniqueName: \"kubernetes.io/projected/703b95cf-2960-46b7-869c-a14c1ccac1ba-kube-api-access-67rjb\") on node \"crc\" DevicePath \"\"" Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.436150 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/703b95cf-2960-46b7-869c-a14c1ccac1ba-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.738652 4778 generic.go:334] "Generic (PLEG): container finished" podID="703b95cf-2960-46b7-869c-a14c1ccac1ba" containerID="b0949e8058c8230a242a1eba7af93276c7b6124fb9292e2a68af4a524336bfe3" exitCode=0 Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.738702 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v65hr" event={"ID":"703b95cf-2960-46b7-869c-a14c1ccac1ba","Type":"ContainerDied","Data":"b0949e8058c8230a242a1eba7af93276c7b6124fb9292e2a68af4a524336bfe3"} Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.738731 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v65hr" event={"ID":"703b95cf-2960-46b7-869c-a14c1ccac1ba","Type":"ContainerDied","Data":"9285ef5f00d27dfa393243fbde7b7c5ae34e79fbcc925a4d5935d952466683b2"} Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.738789 4778 scope.go:117] "RemoveContainer" containerID="b0949e8058c8230a242a1eba7af93276c7b6124fb9292e2a68af4a524336bfe3" Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.739270 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v65hr" Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.774229 4778 scope.go:117] "RemoveContainer" containerID="5af38442485686f61eb0fed94c29e4d3019a72772ff54e820eaf82ae7f18b9bd" Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.796212 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v65hr"] Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.805199 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v65hr"] Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.806760 4778 scope.go:117] "RemoveContainer" containerID="d2e3386bdb428ecd6f840420f9ca9ed17049ad362a9a18e38443d59e8e6535fa" Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.872710 4778 scope.go:117] "RemoveContainer" containerID="b0949e8058c8230a242a1eba7af93276c7b6124fb9292e2a68af4a524336bfe3" Sep 30 17:54:23 crc kubenswrapper[4778]: E0930 17:54:23.873286 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0949e8058c8230a242a1eba7af93276c7b6124fb9292e2a68af4a524336bfe3\": container with ID starting with b0949e8058c8230a242a1eba7af93276c7b6124fb9292e2a68af4a524336bfe3 not found: ID does not exist" containerID="b0949e8058c8230a242a1eba7af93276c7b6124fb9292e2a68af4a524336bfe3" Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.873333 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0949e8058c8230a242a1eba7af93276c7b6124fb9292e2a68af4a524336bfe3"} err="failed to get container status \"b0949e8058c8230a242a1eba7af93276c7b6124fb9292e2a68af4a524336bfe3\": rpc error: code = NotFound desc = could not find container \"b0949e8058c8230a242a1eba7af93276c7b6124fb9292e2a68af4a524336bfe3\": container with ID starting with b0949e8058c8230a242a1eba7af93276c7b6124fb9292e2a68af4a524336bfe3 not found: ID does not exist" Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.873360 4778 scope.go:117] "RemoveContainer" containerID="5af38442485686f61eb0fed94c29e4d3019a72772ff54e820eaf82ae7f18b9bd" Sep 30 17:54:23 crc kubenswrapper[4778]: E0930 17:54:23.873887 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5af38442485686f61eb0fed94c29e4d3019a72772ff54e820eaf82ae7f18b9bd\": container with ID starting with 5af38442485686f61eb0fed94c29e4d3019a72772ff54e820eaf82ae7f18b9bd not found: ID does not exist" containerID="5af38442485686f61eb0fed94c29e4d3019a72772ff54e820eaf82ae7f18b9bd" Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.873942 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5af38442485686f61eb0fed94c29e4d3019a72772ff54e820eaf82ae7f18b9bd"} err="failed to get container status \"5af38442485686f61eb0fed94c29e4d3019a72772ff54e820eaf82ae7f18b9bd\": rpc error: code = NotFound desc = could not find container \"5af38442485686f61eb0fed94c29e4d3019a72772ff54e820eaf82ae7f18b9bd\": container with ID starting with 5af38442485686f61eb0fed94c29e4d3019a72772ff54e820eaf82ae7f18b9bd not found: ID does not exist" Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.873989 4778 scope.go:117] "RemoveContainer" containerID="d2e3386bdb428ecd6f840420f9ca9ed17049ad362a9a18e38443d59e8e6535fa" Sep 30 17:54:23 crc kubenswrapper[4778]: E0930 17:54:23.874315 4778 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d2e3386bdb428ecd6f840420f9ca9ed17049ad362a9a18e38443d59e8e6535fa\": container with ID starting with d2e3386bdb428ecd6f840420f9ca9ed17049ad362a9a18e38443d59e8e6535fa not found: ID does not exist" containerID="d2e3386bdb428ecd6f840420f9ca9ed17049ad362a9a18e38443d59e8e6535fa" Sep 30 17:54:23 crc kubenswrapper[4778]: I0930 17:54:23.874340 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2e3386bdb428ecd6f840420f9ca9ed17049ad362a9a18e38443d59e8e6535fa"} err="failed to get container status \"d2e3386bdb428ecd6f840420f9ca9ed17049ad362a9a18e38443d59e8e6535fa\": rpc error: code = NotFound desc = could not find container \"d2e3386bdb428ecd6f840420f9ca9ed17049ad362a9a18e38443d59e8e6535fa\": container with ID starting with d2e3386bdb428ecd6f840420f9ca9ed17049ad362a9a18e38443d59e8e6535fa not found: ID does not exist" Sep 30 17:54:25 crc kubenswrapper[4778]: I0930 17:54:25.728273 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="703b95cf-2960-46b7-869c-a14c1ccac1ba" path="/var/lib/kubelet/pods/703b95cf-2960-46b7-869c-a14c1ccac1ba/volumes" Sep 30 17:56:10 crc kubenswrapper[4778]: I0930 17:56:10.277046 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fpsc4"] Sep 30 17:56:10 crc kubenswrapper[4778]: E0930 17:56:10.278378 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703b95cf-2960-46b7-869c-a14c1ccac1ba" containerName="extract-utilities" Sep 30 17:56:10 crc kubenswrapper[4778]: I0930 17:56:10.278396 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="703b95cf-2960-46b7-869c-a14c1ccac1ba" containerName="extract-utilities" Sep 30 17:56:10 crc kubenswrapper[4778]: E0930 17:56:10.278415 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703b95cf-2960-46b7-869c-a14c1ccac1ba" containerName="extract-content" Sep 30 17:56:10 crc kubenswrapper[4778]: I0930 17:56:10.278425 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="703b95cf-2960-46b7-869c-a14c1ccac1ba" containerName="extract-content" Sep 30 17:56:10 crc kubenswrapper[4778]: E0930 17:56:10.278462 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="703b95cf-2960-46b7-869c-a14c1ccac1ba" containerName="registry-server" Sep 30 17:56:10 crc kubenswrapper[4778]: I0930 17:56:10.278470 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="703b95cf-2960-46b7-869c-a14c1ccac1ba" containerName="registry-server" Sep 30 17:56:10 crc kubenswrapper[4778]: I0930 17:56:10.278708 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="703b95cf-2960-46b7-869c-a14c1ccac1ba" containerName="registry-server" Sep 30 17:56:10 crc kubenswrapper[4778]: I0930 17:56:10.280650 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fpsc4" Sep 30 17:56:10 crc kubenswrapper[4778]: I0930 17:56:10.295924 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fpsc4"] Sep 30 17:56:10 crc kubenswrapper[4778]: I0930 17:56:10.380168 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/824a0b2b-aff3-471a-8b4c-327e34defdcb-utilities\") pod \"redhat-operators-fpsc4\" (UID: \"824a0b2b-aff3-471a-8b4c-327e34defdcb\") " pod="openshift-marketplace/redhat-operators-fpsc4" Sep 30 17:56:10 crc kubenswrapper[4778]: I0930 17:56:10.380302 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/824a0b2b-aff3-471a-8b4c-327e34defdcb-catalog-content\") pod \"redhat-operators-fpsc4\" (UID: \"824a0b2b-aff3-471a-8b4c-327e34defdcb\") " pod="openshift-marketplace/redhat-operators-fpsc4" Sep 30 17:56:10 crc kubenswrapper[4778]: I0930 17:56:10.380357 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpbsk\" (UniqueName: \"kubernetes.io/projected/824a0b2b-aff3-471a-8b4c-327e34defdcb-kube-api-access-fpbsk\") pod \"redhat-operators-fpsc4\" (UID: \"824a0b2b-aff3-471a-8b4c-327e34defdcb\") " pod="openshift-marketplace/redhat-operators-fpsc4" Sep 30 17:56:10 crc kubenswrapper[4778]: I0930 17:56:10.482130 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/824a0b2b-aff3-471a-8b4c-327e34defdcb-catalog-content\") pod \"redhat-operators-fpsc4\" (UID: \"824a0b2b-aff3-471a-8b4c-327e34defdcb\") " pod="openshift-marketplace/redhat-operators-fpsc4" Sep 30 17:56:10 crc kubenswrapper[4778]: I0930 17:56:10.482190 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpbsk\" (UniqueName: \"kubernetes.io/projected/824a0b2b-aff3-471a-8b4c-327e34defdcb-kube-api-access-fpbsk\") pod \"redhat-operators-fpsc4\" (UID: \"824a0b2b-aff3-471a-8b4c-327e34defdcb\") " pod="openshift-marketplace/redhat-operators-fpsc4" Sep 30 17:56:10 crc kubenswrapper[4778]: I0930 17:56:10.482257 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/824a0b2b-aff3-471a-8b4c-327e34defdcb-utilities\") pod \"redhat-operators-fpsc4\" (UID: \"824a0b2b-aff3-471a-8b4c-327e34defdcb\") " pod="openshift-marketplace/redhat-operators-fpsc4" Sep 30 17:56:10 crc kubenswrapper[4778]: I0930 17:56:10.482777 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/824a0b2b-aff3-471a-8b4c-327e34defdcb-utilities\") pod \"redhat-operators-fpsc4\" (UID: \"824a0b2b-aff3-471a-8b4c-327e34defdcb\") " pod="openshift-marketplace/redhat-operators-fpsc4" Sep 30 17:56:10 crc kubenswrapper[4778]: I0930 17:56:10.482774 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/824a0b2b-aff3-471a-8b4c-327e34defdcb-catalog-content\") pod \"redhat-operators-fpsc4\" (UID: \"824a0b2b-aff3-471a-8b4c-327e34defdcb\") " pod="openshift-marketplace/redhat-operators-fpsc4" Sep 30 17:56:10 crc kubenswrapper[4778]: I0930 17:56:10.512887 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fpbsk\" (UniqueName: \"kubernetes.io/projected/824a0b2b-aff3-471a-8b4c-327e34defdcb-kube-api-access-fpbsk\") pod \"redhat-operators-fpsc4\" (UID: \"824a0b2b-aff3-471a-8b4c-327e34defdcb\") " pod="openshift-marketplace/redhat-operators-fpsc4" Sep 30 17:56:10 crc kubenswrapper[4778]: I0930 17:56:10.612290 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fpsc4" Sep 30 17:56:11 crc kubenswrapper[4778]: I0930 17:56:11.159400 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fpsc4"] Sep 30 17:56:11 crc kubenswrapper[4778]: I0930 17:56:11.746908 4778 generic.go:334] "Generic (PLEG): container finished" podID="824a0b2b-aff3-471a-8b4c-327e34defdcb" containerID="9d6a972eba6e0a908a2c8a6d361447f7330b9c8dbecbf876bf0bcddb7a8b62d0" exitCode=0 Sep 30 17:56:11 crc kubenswrapper[4778]: I0930 17:56:11.746946 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpsc4" event={"ID":"824a0b2b-aff3-471a-8b4c-327e34defdcb","Type":"ContainerDied","Data":"9d6a972eba6e0a908a2c8a6d361447f7330b9c8dbecbf876bf0bcddb7a8b62d0"} Sep 30 17:56:11 crc kubenswrapper[4778]: I0930 17:56:11.746968 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpsc4" event={"ID":"824a0b2b-aff3-471a-8b4c-327e34defdcb","Type":"ContainerStarted","Data":"bcc6d9ca54d175e4a6efce6cca05c24e721ce8613c83052bbbbfdb986380695c"} Sep 30 17:56:12 crc kubenswrapper[4778]: I0930 17:56:12.756907 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpsc4" event={"ID":"824a0b2b-aff3-471a-8b4c-327e34defdcb","Type":"ContainerStarted","Data":"24ae1171c537cab503c0ecf7dfd86f3063b72f54c37436b97f0447262baf17fc"} Sep 30 17:56:13 crc kubenswrapper[4778]: I0930 17:56:13.771333 4778 generic.go:334] "Generic (PLEG): container finished" podID="824a0b2b-aff3-471a-8b4c-327e34defdcb" containerID="24ae1171c537cab503c0ecf7dfd86f3063b72f54c37436b97f0447262baf17fc" exitCode=0 Sep 30 17:56:13 crc kubenswrapper[4778]: I0930 17:56:13.771563 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpsc4" event={"ID":"824a0b2b-aff3-471a-8b4c-327e34defdcb","Type":"ContainerDied","Data":"24ae1171c537cab503c0ecf7dfd86f3063b72f54c37436b97f0447262baf17fc"} Sep 30 17:56:15 crc kubenswrapper[4778]: I0930 17:56:15.793378 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpsc4" event={"ID":"824a0b2b-aff3-471a-8b4c-327e34defdcb","Type":"ContainerStarted","Data":"db73105da623c0a4e886981d7de90385afa7444e9d73bf57ad4e5100d77c701f"} Sep 30 17:56:15 crc kubenswrapper[4778]: I0930 17:56:15.816281 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fpsc4" podStartSLOduration=3.114167703 podStartE2EDuration="5.81625631s" podCreationTimestamp="2025-09-30 17:56:10 +0000 UTC" firstStartedPulling="2025-09-30 17:56:11.74900974 +0000 UTC m=+2310.738907543" lastFinishedPulling="2025-09-30 17:56:14.451098347 +0000 UTC m=+2313.440996150" observedRunningTime="2025-09-30 17:56:15.812352059 +0000 UTC m=+2314.802249872" watchObservedRunningTime="2025-09-30 17:56:15.81625631 +0000 UTC m=+2314.806154113" Sep 30 17:56:20 crc kubenswrapper[4778]: I0930 17:56:20.613228 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fpsc4" Sep 30 
Sep 30 17:56:20 crc kubenswrapper[4778]: I0930 17:56:20.613903 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fpsc4"
Sep 30 17:56:20 crc kubenswrapper[4778]: I0930 17:56:20.678191 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fpsc4"
Sep 30 17:56:20 crc kubenswrapper[4778]: I0930 17:56:20.885020 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fpsc4"
Sep 30 17:56:20 crc kubenswrapper[4778]: I0930 17:56:20.966305 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fpsc4"]
Sep 30 17:56:22 crc kubenswrapper[4778]: I0930 17:56:22.862687 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fpsc4" podUID="824a0b2b-aff3-471a-8b4c-327e34defdcb" containerName="registry-server" containerID="cri-o://db73105da623c0a4e886981d7de90385afa7444e9d73bf57ad4e5100d77c701f" gracePeriod=2
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.333968 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fpsc4"
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.429755 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/824a0b2b-aff3-471a-8b4c-327e34defdcb-utilities\") pod \"824a0b2b-aff3-471a-8b4c-327e34defdcb\" (UID: \"824a0b2b-aff3-471a-8b4c-327e34defdcb\") "
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.430016 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpbsk\" (UniqueName: \"kubernetes.io/projected/824a0b2b-aff3-471a-8b4c-327e34defdcb-kube-api-access-fpbsk\") pod \"824a0b2b-aff3-471a-8b4c-327e34defdcb\" (UID: \"824a0b2b-aff3-471a-8b4c-327e34defdcb\") "
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.430071 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/824a0b2b-aff3-471a-8b4c-327e34defdcb-catalog-content\") pod \"824a0b2b-aff3-471a-8b4c-327e34defdcb\" (UID: \"824a0b2b-aff3-471a-8b4c-327e34defdcb\") "
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.431303 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/824a0b2b-aff3-471a-8b4c-327e34defdcb-utilities" (OuterVolumeSpecName: "utilities") pod "824a0b2b-aff3-471a-8b4c-327e34defdcb" (UID: "824a0b2b-aff3-471a-8b4c-327e34defdcb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.432819 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/824a0b2b-aff3-471a-8b4c-327e34defdcb-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.435927 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/824a0b2b-aff3-471a-8b4c-327e34defdcb-kube-api-access-fpbsk" (OuterVolumeSpecName: "kube-api-access-fpbsk") pod "824a0b2b-aff3-471a-8b4c-327e34defdcb" (UID: "824a0b2b-aff3-471a-8b4c-327e34defdcb"). InnerVolumeSpecName "kube-api-access-fpbsk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.528493 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/824a0b2b-aff3-471a-8b4c-327e34defdcb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "824a0b2b-aff3-471a-8b4c-327e34defdcb" (UID: "824a0b2b-aff3-471a-8b4c-327e34defdcb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.534399 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpbsk\" (UniqueName: \"kubernetes.io/projected/824a0b2b-aff3-471a-8b4c-327e34defdcb-kube-api-access-fpbsk\") on node \"crc\" DevicePath \"\""
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.534652 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/824a0b2b-aff3-471a-8b4c-327e34defdcb-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.873561 4778 generic.go:334] "Generic (PLEG): container finished" podID="824a0b2b-aff3-471a-8b4c-327e34defdcb" containerID="db73105da623c0a4e886981d7de90385afa7444e9d73bf57ad4e5100d77c701f" exitCode=0
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.873674 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpsc4" event={"ID":"824a0b2b-aff3-471a-8b4c-327e34defdcb","Type":"ContainerDied","Data":"db73105da623c0a4e886981d7de90385afa7444e9d73bf57ad4e5100d77c701f"}
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.873712 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fpsc4" event={"ID":"824a0b2b-aff3-471a-8b4c-327e34defdcb","Type":"ContainerDied","Data":"bcc6d9ca54d175e4a6efce6cca05c24e721ce8613c83052bbbbfdb986380695c"}
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.873730 4778 scope.go:117] "RemoveContainer" containerID="db73105da623c0a4e886981d7de90385afa7444e9d73bf57ad4e5100d77c701f"
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.873747 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fpsc4"
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.901421 4778 scope.go:117] "RemoveContainer" containerID="24ae1171c537cab503c0ecf7dfd86f3063b72f54c37436b97f0447262baf17fc"
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.913550 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fpsc4"]
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.925604 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fpsc4"]
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.935726 4778 scope.go:117] "RemoveContainer" containerID="9d6a972eba6e0a908a2c8a6d361447f7330b9c8dbecbf876bf0bcddb7a8b62d0"
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.973733 4778 scope.go:117] "RemoveContainer" containerID="db73105da623c0a4e886981d7de90385afa7444e9d73bf57ad4e5100d77c701f"
Sep 30 17:56:23 crc kubenswrapper[4778]: E0930 17:56:23.974589 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db73105da623c0a4e886981d7de90385afa7444e9d73bf57ad4e5100d77c701f\": container with ID starting with db73105da623c0a4e886981d7de90385afa7444e9d73bf57ad4e5100d77c701f not found: ID does not exist" containerID="db73105da623c0a4e886981d7de90385afa7444e9d73bf57ad4e5100d77c701f"
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.974694 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db73105da623c0a4e886981d7de90385afa7444e9d73bf57ad4e5100d77c701f"} err="failed to get container status \"db73105da623c0a4e886981d7de90385afa7444e9d73bf57ad4e5100d77c701f\": rpc error: code = NotFound desc = could not find container \"db73105da623c0a4e886981d7de90385afa7444e9d73bf57ad4e5100d77c701f\": container with ID starting with db73105da623c0a4e886981d7de90385afa7444e9d73bf57ad4e5100d77c701f not found: ID does not exist"
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.974739 4778 scope.go:117] "RemoveContainer" containerID="24ae1171c537cab503c0ecf7dfd86f3063b72f54c37436b97f0447262baf17fc"
Sep 30 17:56:23 crc kubenswrapper[4778]: E0930 17:56:23.975311 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24ae1171c537cab503c0ecf7dfd86f3063b72f54c37436b97f0447262baf17fc\": container with ID starting with 24ae1171c537cab503c0ecf7dfd86f3063b72f54c37436b97f0447262baf17fc not found: ID does not exist" containerID="24ae1171c537cab503c0ecf7dfd86f3063b72f54c37436b97f0447262baf17fc"
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.975527 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24ae1171c537cab503c0ecf7dfd86f3063b72f54c37436b97f0447262baf17fc"} err="failed to get container status \"24ae1171c537cab503c0ecf7dfd86f3063b72f54c37436b97f0447262baf17fc\": rpc error: code = NotFound desc = could not find container \"24ae1171c537cab503c0ecf7dfd86f3063b72f54c37436b97f0447262baf17fc\": container with ID starting with 24ae1171c537cab503c0ecf7dfd86f3063b72f54c37436b97f0447262baf17fc not found: ID does not exist"
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.975789 4778 scope.go:117] "RemoveContainer" containerID="9d6a972eba6e0a908a2c8a6d361447f7330b9c8dbecbf876bf0bcddb7a8b62d0"
Sep 30 17:56:23 crc kubenswrapper[4778]: E0930 17:56:23.976283 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d6a972eba6e0a908a2c8a6d361447f7330b9c8dbecbf876bf0bcddb7a8b62d0\": container with ID starting with 9d6a972eba6e0a908a2c8a6d361447f7330b9c8dbecbf876bf0bcddb7a8b62d0 not found: ID does not exist" containerID="9d6a972eba6e0a908a2c8a6d361447f7330b9c8dbecbf876bf0bcddb7a8b62d0"
Sep 30 17:56:23 crc kubenswrapper[4778]: I0930 17:56:23.976340 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d6a972eba6e0a908a2c8a6d361447f7330b9c8dbecbf876bf0bcddb7a8b62d0"} err="failed to get container status \"9d6a972eba6e0a908a2c8a6d361447f7330b9c8dbecbf876bf0bcddb7a8b62d0\": rpc error: code = NotFound desc = could not find container \"9d6a972eba6e0a908a2c8a6d361447f7330b9c8dbecbf876bf0bcddb7a8b62d0\": container with ID starting with 9d6a972eba6e0a908a2c8a6d361447f7330b9c8dbecbf876bf0bcddb7a8b62d0 not found: ID does not exist"
Sep 30 17:56:25 crc kubenswrapper[4778]: I0930 17:56:25.731060 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="824a0b2b-aff3-471a-8b4c-327e34defdcb" path="/var/lib/kubelet/pods/824a0b2b-aff3-471a-8b4c-327e34defdcb/volumes"
status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 17:57:44 crc kubenswrapper[4778]: I0930 17:57:44.817097 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"} pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 17:57:44 crc kubenswrapper[4778]: I0930 17:57:44.817245 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" containerID="cri-o://5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2" gracePeriod=600 Sep 30 17:57:44 crc kubenswrapper[4778]: E0930 17:57:44.953373 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:57:45 crc kubenswrapper[4778]: I0930 17:57:45.650891 4778 generic.go:334] "Generic (PLEG): container finished" podID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2" exitCode=0 Sep 30 17:57:45 crc kubenswrapper[4778]: I0930 17:57:45.650942 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerDied","Data":"5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"} Sep 30 17:57:45 crc kubenswrapper[4778]: I0930 17:57:45.651007 4778 scope.go:117] "RemoveContainer" containerID="363873babd277c029b7d9966d143bc81c18d516f3abbaa8796da4f45c82bd4ee" Sep 30 17:57:45 crc kubenswrapper[4778]: I0930 17:57:45.651746 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2" Sep 30 17:57:45 crc kubenswrapper[4778]: E0930 17:57:45.652110 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:57:57 crc kubenswrapper[4778]: I0930 17:57:57.714206 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2" Sep 30 17:57:57 crc kubenswrapper[4778]: E0930 17:57:57.715119 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 17:58:09 crc 
Sep 30 17:58:09 crc kubenswrapper[4778]: I0930 17:58:09.714295 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"
Sep 30 17:58:09 crc kubenswrapper[4778]: E0930 17:58:09.715560 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e"
Sep 30 17:58:21 crc kubenswrapper[4778]: I0930 17:58:21.720807 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"
Sep 30 17:58:21 crc kubenswrapper[4778]: E0930 17:58:21.721929 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e"
Sep 30 17:58:35 crc kubenswrapper[4778]: I0930 17:58:35.714084 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"
Sep 30 17:58:35 crc kubenswrapper[4778]: E0930 17:58:35.715258 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e"
Sep 30 17:58:48 crc kubenswrapper[4778]: I0930 17:58:48.714756 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"
Sep 30 17:58:48 crc kubenswrapper[4778]: E0930 17:58:48.715681 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e"
Sep 30 17:59:01 crc kubenswrapper[4778]: I0930 17:59:01.723390 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"
Sep 30 17:59:01 crc kubenswrapper[4778]: E0930 17:59:01.724505 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e"
Sep 30 17:59:15 crc kubenswrapper[4778]: I0930 17:59:15.734878 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"
Sep 30 17:59:15 crc kubenswrapper[4778]: E0930 17:59:15.735971 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e"
Sep 30 17:59:29 crc kubenswrapper[4778]: I0930 17:59:29.714344 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"
Sep 30 17:59:29 crc kubenswrapper[4778]: E0930 17:59:29.715583 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e"
Sep 30 17:59:41 crc kubenswrapper[4778]: I0930 17:59:41.725006 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"
Sep 30 17:59:41 crc kubenswrapper[4778]: E0930 17:59:41.726912 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e"
Sep 30 17:59:54 crc kubenswrapper[4778]: I0930 17:59:54.713504 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"
Sep 30 17:59:54 crc kubenswrapper[4778]: E0930 17:59:54.714719 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e"
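The repeating "back-off 5m0s" errors are the sync loop re-logging a restart delay that has hit its cap. A sketch of that schedule, assuming the conventional 10s initial delay doubling per failure (only the 5m cap is confirmed by the message itself):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 10 * time.Second   // assumed initial CrashLoopBackOff delay
	maxDelay := 5 * time.Minute // the "back-off 5m0s" cap from the log
	for attempt := 1; attempt <= 7; attempt++ {
		fmt.Printf("restart attempt %d: wait %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
	// waits: 10s, 20s, 40s, 1m20s, 2m40s, 5m0s, 5m0s — once capped, every
	// retry logs the same 5m0s back-off, matching the run of entries above.
}
```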
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.153368 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k"]
Sep 30 18:00:00 crc kubenswrapper[4778]: E0930 18:00:00.154336 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="824a0b2b-aff3-471a-8b4c-327e34defdcb" containerName="registry-server"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.154356 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="824a0b2b-aff3-471a-8b4c-327e34defdcb" containerName="registry-server"
Sep 30 18:00:00 crc kubenswrapper[4778]: E0930 18:00:00.154378 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="824a0b2b-aff3-471a-8b4c-327e34defdcb" containerName="extract-utilities"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.154389 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="824a0b2b-aff3-471a-8b4c-327e34defdcb" containerName="extract-utilities"
Sep 30 18:00:00 crc kubenswrapper[4778]: E0930 18:00:00.154409 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="824a0b2b-aff3-471a-8b4c-327e34defdcb" containerName="extract-content"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.154421 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="824a0b2b-aff3-471a-8b4c-327e34defdcb" containerName="extract-content"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.154680 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="824a0b2b-aff3-471a-8b4c-327e34defdcb" containerName="registry-server"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.155369 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.157683 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.163050 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.179064 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k"]
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.295083 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x77qx\" (UniqueName: \"kubernetes.io/projected/03cea904-b454-485c-8b3c-808f68772227-kube-api-access-x77qx\") pod \"collect-profiles-29320920-2g85k\" (UID: \"03cea904-b454-485c-8b3c-808f68772227\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.295221 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03cea904-b454-485c-8b3c-808f68772227-secret-volume\") pod \"collect-profiles-29320920-2g85k\" (UID: \"03cea904-b454-485c-8b3c-808f68772227\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.295286 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03cea904-b454-485c-8b3c-808f68772227-config-volume\") pod \"collect-profiles-29320920-2g85k\" (UID: \"03cea904-b454-485c-8b3c-808f68772227\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.396822 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x77qx\" (UniqueName: \"kubernetes.io/projected/03cea904-b454-485c-8b3c-808f68772227-kube-api-access-x77qx\") pod \"collect-profiles-29320920-2g85k\" (UID: \"03cea904-b454-485c-8b3c-808f68772227\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.396963 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03cea904-b454-485c-8b3c-808f68772227-secret-volume\") pod \"collect-profiles-29320920-2g85k\" (UID: \"03cea904-b454-485c-8b3c-808f68772227\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.396993 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03cea904-b454-485c-8b3c-808f68772227-config-volume\") pod \"collect-profiles-29320920-2g85k\" (UID: \"03cea904-b454-485c-8b3c-808f68772227\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.398210 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03cea904-b454-485c-8b3c-808f68772227-config-volume\") pod \"collect-profiles-29320920-2g85k\" (UID: \"03cea904-b454-485c-8b3c-808f68772227\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.404827 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03cea904-b454-485c-8b3c-808f68772227-secret-volume\") pod \"collect-profiles-29320920-2g85k\" (UID: \"03cea904-b454-485c-8b3c-808f68772227\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.428144 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x77qx\" (UniqueName: \"kubernetes.io/projected/03cea904-b454-485c-8b3c-808f68772227-kube-api-access-x77qx\") pod \"collect-profiles-29320920-2g85k\" (UID: \"03cea904-b454-485c-8b3c-808f68772227\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.485485 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k"
Sep 30 18:00:00 crc kubenswrapper[4778]: I0930 18:00:00.956888 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k"]
Sep 30 18:00:01 crc kubenswrapper[4778]: I0930 18:00:01.946981 4778 generic.go:334] "Generic (PLEG): container finished" podID="03cea904-b454-485c-8b3c-808f68772227" containerID="3c8246b06ab684926efd162a546a9a5a064d2289dba9a312a7f00b0d34df085f" exitCode=0
Sep 30 18:00:01 crc kubenswrapper[4778]: I0930 18:00:01.947041 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k" event={"ID":"03cea904-b454-485c-8b3c-808f68772227","Type":"ContainerDied","Data":"3c8246b06ab684926efd162a546a9a5a064d2289dba9a312a7f00b0d34df085f"}
Sep 30 18:00:01 crc kubenswrapper[4778]: I0930 18:00:01.947561 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k" event={"ID":"03cea904-b454-485c-8b3c-808f68772227","Type":"ContainerStarted","Data":"423917a4b27815bfead9792d965fe1f9c84127cadcbea655354541bbe71f577d"}
Sep 30 18:00:03 crc kubenswrapper[4778]: I0930 18:00:03.364355 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k"
Sep 30 18:00:03 crc kubenswrapper[4778]: I0930 18:00:03.450515 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x77qx\" (UniqueName: \"kubernetes.io/projected/03cea904-b454-485c-8b3c-808f68772227-kube-api-access-x77qx\") pod \"03cea904-b454-485c-8b3c-808f68772227\" (UID: \"03cea904-b454-485c-8b3c-808f68772227\") "
Sep 30 18:00:03 crc kubenswrapper[4778]: I0930 18:00:03.450673 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03cea904-b454-485c-8b3c-808f68772227-secret-volume\") pod \"03cea904-b454-485c-8b3c-808f68772227\" (UID: \"03cea904-b454-485c-8b3c-808f68772227\") "
Sep 30 18:00:03 crc kubenswrapper[4778]: I0930 18:00:03.450797 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03cea904-b454-485c-8b3c-808f68772227-config-volume\") pod \"03cea904-b454-485c-8b3c-808f68772227\" (UID: \"03cea904-b454-485c-8b3c-808f68772227\") "
Sep 30 18:00:03 crc kubenswrapper[4778]: I0930 18:00:03.451454 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03cea904-b454-485c-8b3c-808f68772227-config-volume" (OuterVolumeSpecName: "config-volume") pod "03cea904-b454-485c-8b3c-808f68772227" (UID: "03cea904-b454-485c-8b3c-808f68772227"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 18:00:03 crc kubenswrapper[4778]: I0930 18:00:03.457077 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03cea904-b454-485c-8b3c-808f68772227-kube-api-access-x77qx" (OuterVolumeSpecName: "kube-api-access-x77qx") pod "03cea904-b454-485c-8b3c-808f68772227" (UID: "03cea904-b454-485c-8b3c-808f68772227"). InnerVolumeSpecName "kube-api-access-x77qx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:00:03 crc kubenswrapper[4778]: I0930 18:00:03.457368 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03cea904-b454-485c-8b3c-808f68772227-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "03cea904-b454-485c-8b3c-808f68772227" (UID: "03cea904-b454-485c-8b3c-808f68772227"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 18:00:03 crc kubenswrapper[4778]: I0930 18:00:03.553647 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x77qx\" (UniqueName: \"kubernetes.io/projected/03cea904-b454-485c-8b3c-808f68772227-kube-api-access-x77qx\") on node \"crc\" DevicePath \"\""
Sep 30 18:00:03 crc kubenswrapper[4778]: I0930 18:00:03.553682 4778 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/03cea904-b454-485c-8b3c-808f68772227-secret-volume\") on node \"crc\" DevicePath \"\""
Sep 30 18:00:03 crc kubenswrapper[4778]: I0930 18:00:03.553695 4778 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/03cea904-b454-485c-8b3c-808f68772227-config-volume\") on node \"crc\" DevicePath \"\""
Sep 30 18:00:03 crc kubenswrapper[4778]: I0930 18:00:03.968127 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k" event={"ID":"03cea904-b454-485c-8b3c-808f68772227","Type":"ContainerDied","Data":"423917a4b27815bfead9792d965fe1f9c84127cadcbea655354541bbe71f577d"}
Sep 30 18:00:03 crc kubenswrapper[4778]: I0930 18:00:03.968173 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="423917a4b27815bfead9792d965fe1f9c84127cadcbea655354541bbe71f577d"
Sep 30 18:00:03 crc kubenswrapper[4778]: I0930 18:00:03.968235 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320920-2g85k"
Sep 30 18:00:04 crc kubenswrapper[4778]: I0930 18:00:04.483055 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c"]
Sep 30 18:00:04 crc kubenswrapper[4778]: I0930 18:00:04.491588 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320875-66s7c"]
Sep 30 18:00:05 crc kubenswrapper[4778]: I0930 18:00:05.728483 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="535566fd-2a2e-43a4-94cb-dea8d1e2123b" path="/var/lib/kubelet/pods/535566fd-2a2e-43a4-94cb-dea8d1e2123b/volumes"
pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 18:00:29 crc kubenswrapper[4778]: I0930 18:00:29.714721 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2" Sep 30 18:00:29 crc kubenswrapper[4778]: E0930 18:00:29.716168 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 18:00:42 crc kubenswrapper[4778]: I0930 18:00:42.715165 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2" Sep 30 18:00:42 crc kubenswrapper[4778]: E0930 18:00:42.719493 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 18:00:47 crc kubenswrapper[4778]: I0930 18:00:47.888389 4778 scope.go:117] "RemoveContainer" containerID="b219f43da38348e2df202028bf2ab09060d1fa63e6dfc7a8fe4c44d947aa3369" Sep 30 18:00:55 crc kubenswrapper[4778]: I0930 18:00:55.713977 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2" Sep 30 18:00:55 crc kubenswrapper[4778]: E0930 18:00:55.715329 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.179260 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29320921-rtbks"] Sep 30 18:01:00 crc kubenswrapper[4778]: E0930 18:01:00.180943 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03cea904-b454-485c-8b3c-808f68772227" containerName="collect-profiles" Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.180973 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="03cea904-b454-485c-8b3c-808f68772227" containerName="collect-profiles" Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.181424 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="03cea904-b454-485c-8b3c-808f68772227" containerName="collect-profiles" Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.182720 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320921-rtbks" Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.195547 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320921-rtbks"] Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.337591 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-combined-ca-bundle\") pod \"keystone-cron-29320921-rtbks\" (UID: \"bd674a69-6b5c-4374-b729-1f40e983bea8\") " pod="openstack/keystone-cron-29320921-rtbks" Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.337767 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djsqc\" (UniqueName: \"kubernetes.io/projected/bd674a69-6b5c-4374-b729-1f40e983bea8-kube-api-access-djsqc\") pod \"keystone-cron-29320921-rtbks\" (UID: \"bd674a69-6b5c-4374-b729-1f40e983bea8\") " pod="openstack/keystone-cron-29320921-rtbks" Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.338342 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-config-data\") pod \"keystone-cron-29320921-rtbks\" (UID: \"bd674a69-6b5c-4374-b729-1f40e983bea8\") " pod="openstack/keystone-cron-29320921-rtbks" Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.338407 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-fernet-keys\") pod \"keystone-cron-29320921-rtbks\" (UID: \"bd674a69-6b5c-4374-b729-1f40e983bea8\") " pod="openstack/keystone-cron-29320921-rtbks" Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.439838 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-config-data\") pod \"keystone-cron-29320921-rtbks\" (UID: \"bd674a69-6b5c-4374-b729-1f40e983bea8\") " pod="openstack/keystone-cron-29320921-rtbks" Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.439917 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-fernet-keys\") pod \"keystone-cron-29320921-rtbks\" (UID: \"bd674a69-6b5c-4374-b729-1f40e983bea8\") " pod="openstack/keystone-cron-29320921-rtbks" Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.439965 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-combined-ca-bundle\") pod \"keystone-cron-29320921-rtbks\" (UID: \"bd674a69-6b5c-4374-b729-1f40e983bea8\") " pod="openstack/keystone-cron-29320921-rtbks" Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.440085 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djsqc\" (UniqueName: \"kubernetes.io/projected/bd674a69-6b5c-4374-b729-1f40e983bea8-kube-api-access-djsqc\") pod \"keystone-cron-29320921-rtbks\" (UID: \"bd674a69-6b5c-4374-b729-1f40e983bea8\") " pod="openstack/keystone-cron-29320921-rtbks" Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.448684 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-fernet-keys\") pod \"keystone-cron-29320921-rtbks\" (UID: \"bd674a69-6b5c-4374-b729-1f40e983bea8\") " pod="openstack/keystone-cron-29320921-rtbks" Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.449389 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-config-data\") pod \"keystone-cron-29320921-rtbks\" (UID: \"bd674a69-6b5c-4374-b729-1f40e983bea8\") " pod="openstack/keystone-cron-29320921-rtbks" Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.456904 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-combined-ca-bundle\") pod \"keystone-cron-29320921-rtbks\" (UID: \"bd674a69-6b5c-4374-b729-1f40e983bea8\") " pod="openstack/keystone-cron-29320921-rtbks" Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.475319 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djsqc\" (UniqueName: \"kubernetes.io/projected/bd674a69-6b5c-4374-b729-1f40e983bea8-kube-api-access-djsqc\") pod \"keystone-cron-29320921-rtbks\" (UID: \"bd674a69-6b5c-4374-b729-1f40e983bea8\") " pod="openstack/keystone-cron-29320921-rtbks" Sep 30 18:01:00 crc kubenswrapper[4778]: I0930 18:01:00.542429 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320921-rtbks" Sep 30 18:01:01 crc kubenswrapper[4778]: I0930 18:01:01.030731 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320921-rtbks"] Sep 30 18:01:01 crc kubenswrapper[4778]: I0930 18:01:01.547480 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320921-rtbks" event={"ID":"bd674a69-6b5c-4374-b729-1f40e983bea8","Type":"ContainerStarted","Data":"65eed811e8a000d1658a5176bf4ef67858e2a718e77d15935197dca4e83e95e7"} Sep 30 18:01:01 crc kubenswrapper[4778]: I0930 18:01:01.547997 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320921-rtbks" event={"ID":"bd674a69-6b5c-4374-b729-1f40e983bea8","Type":"ContainerStarted","Data":"edafbb11634bd5d36dbb54e8cf4947675802fe9ef6bc5dda9242daa7d7d00729"} Sep 30 18:01:01 crc kubenswrapper[4778]: I0930 18:01:01.577271 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29320921-rtbks" podStartSLOduration=1.577247329 podStartE2EDuration="1.577247329s" podCreationTimestamp="2025-09-30 18:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:01:01.570072643 +0000 UTC m=+2600.559970466" watchObservedRunningTime="2025-09-30 18:01:01.577247329 +0000 UTC m=+2600.567145142" Sep 30 18:01:03 crc kubenswrapper[4778]: I0930 18:01:03.580481 4778 generic.go:334] "Generic (PLEG): container finished" podID="bd674a69-6b5c-4374-b729-1f40e983bea8" containerID="65eed811e8a000d1658a5176bf4ef67858e2a718e77d15935197dca4e83e95e7" exitCode=0 Sep 30 18:01:03 crc kubenswrapper[4778]: I0930 18:01:03.580554 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320921-rtbks" event={"ID":"bd674a69-6b5c-4374-b729-1f40e983bea8","Type":"ContainerDied","Data":"65eed811e8a000d1658a5176bf4ef67858e2a718e77d15935197dca4e83e95e7"} Sep 30 18:01:04 crc kubenswrapper[4778]: 
I0930 18:01:04.942757 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320921-rtbks" Sep 30 18:01:05 crc kubenswrapper[4778]: I0930 18:01:05.125924 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-config-data\") pod \"bd674a69-6b5c-4374-b729-1f40e983bea8\" (UID: \"bd674a69-6b5c-4374-b729-1f40e983bea8\") " Sep 30 18:01:05 crc kubenswrapper[4778]: I0930 18:01:05.126044 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-fernet-keys\") pod \"bd674a69-6b5c-4374-b729-1f40e983bea8\" (UID: \"bd674a69-6b5c-4374-b729-1f40e983bea8\") " Sep 30 18:01:05 crc kubenswrapper[4778]: I0930 18:01:05.126081 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-combined-ca-bundle\") pod \"bd674a69-6b5c-4374-b729-1f40e983bea8\" (UID: \"bd674a69-6b5c-4374-b729-1f40e983bea8\") " Sep 30 18:01:05 crc kubenswrapper[4778]: I0930 18:01:05.126198 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djsqc\" (UniqueName: \"kubernetes.io/projected/bd674a69-6b5c-4374-b729-1f40e983bea8-kube-api-access-djsqc\") pod \"bd674a69-6b5c-4374-b729-1f40e983bea8\" (UID: \"bd674a69-6b5c-4374-b729-1f40e983bea8\") " Sep 30 18:01:05 crc kubenswrapper[4778]: I0930 18:01:05.136698 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd674a69-6b5c-4374-b729-1f40e983bea8-kube-api-access-djsqc" (OuterVolumeSpecName: "kube-api-access-djsqc") pod "bd674a69-6b5c-4374-b729-1f40e983bea8" (UID: "bd674a69-6b5c-4374-b729-1f40e983bea8"). InnerVolumeSpecName "kube-api-access-djsqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:01:05 crc kubenswrapper[4778]: I0930 18:01:05.137168 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "bd674a69-6b5c-4374-b729-1f40e983bea8" (UID: "bd674a69-6b5c-4374-b729-1f40e983bea8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:05 crc kubenswrapper[4778]: I0930 18:01:05.178060 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd674a69-6b5c-4374-b729-1f40e983bea8" (UID: "bd674a69-6b5c-4374-b729-1f40e983bea8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:05 crc kubenswrapper[4778]: I0930 18:01:05.186677 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-config-data" (OuterVolumeSpecName: "config-data") pod "bd674a69-6b5c-4374-b729-1f40e983bea8" (UID: "bd674a69-6b5c-4374-b729-1f40e983bea8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 18:01:05 crc kubenswrapper[4778]: I0930 18:01:05.229554 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djsqc\" (UniqueName: \"kubernetes.io/projected/bd674a69-6b5c-4374-b729-1f40e983bea8-kube-api-access-djsqc\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:05 crc kubenswrapper[4778]: I0930 18:01:05.229595 4778 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:05 crc kubenswrapper[4778]: I0930 18:01:05.229607 4778 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:05 crc kubenswrapper[4778]: I0930 18:01:05.229638 4778 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd674a69-6b5c-4374-b729-1f40e983bea8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 18:01:05 crc kubenswrapper[4778]: I0930 18:01:05.599512 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320921-rtbks" event={"ID":"bd674a69-6b5c-4374-b729-1f40e983bea8","Type":"ContainerDied","Data":"edafbb11634bd5d36dbb54e8cf4947675802fe9ef6bc5dda9242daa7d7d00729"} Sep 30 18:01:05 crc kubenswrapper[4778]: I0930 18:01:05.599555 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edafbb11634bd5d36dbb54e8cf4947675802fe9ef6bc5dda9242daa7d7d00729" Sep 30 18:01:05 crc kubenswrapper[4778]: I0930 18:01:05.599658 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320921-rtbks" Sep 30 18:01:09 crc kubenswrapper[4778]: I0930 18:01:09.714668 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2" Sep 30 18:01:09 crc kubenswrapper[4778]: E0930 18:01:09.715506 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 18:01:22 crc kubenswrapper[4778]: I0930 18:01:22.714839 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2" Sep 30 18:01:22 crc kubenswrapper[4778]: E0930 18:01:22.715834 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 18:01:35 crc kubenswrapper[4778]: I0930 18:01:35.714544 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2" Sep 30 18:01:35 crc kubenswrapper[4778]: E0930 18:01:35.715790 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
Sep 30 18:01:09 crc kubenswrapper[4778]: I0930 18:01:09.714668 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"
Sep 30 18:01:09 crc kubenswrapper[4778]: E0930 18:01:09.715506 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e"
Sep 30 18:01:22 crc kubenswrapper[4778]: I0930 18:01:22.714839 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"
Sep 30 18:01:22 crc kubenswrapper[4778]: E0930 18:01:22.715834 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e"
Sep 30 18:01:35 crc kubenswrapper[4778]: I0930 18:01:35.714544 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"
Sep 30 18:01:35 crc kubenswrapper[4778]: E0930 18:01:35.715790 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e"
Sep 30 18:01:48 crc kubenswrapper[4778]: I0930 18:01:48.713705 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"
Sep 30 18:01:48 crc kubenswrapper[4778]: E0930 18:01:48.714413 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e"
Sep 30 18:02:03 crc kubenswrapper[4778]: I0930 18:02:03.714551 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"
Sep 30 18:02:03 crc kubenswrapper[4778]: E0930 18:02:03.715645 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e"
Sep 30 18:02:18 crc kubenswrapper[4778]: I0930 18:02:18.713946 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"
Sep 30 18:02:18 crc kubenswrapper[4778]: E0930 18:02:18.714918 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e"
Sep 30 18:02:33 crc kubenswrapper[4778]: I0930 18:02:33.715363 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"
Sep 30 18:02:33 crc kubenswrapper[4778]: E0930 18:02:33.716365 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e"
Sep 30 18:02:46 crc kubenswrapper[4778]: I0930 18:02:46.714180 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2"
Sep 30 18:02:47 crc kubenswrapper[4778]: I0930 18:02:47.512607 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerStarted","Data":"2a5286e9db2f7bc22ae7cbc733132d15348c9a7d4ac339ce379b1f1d56fb4677"}
kubenswrapper[4778]: I0930 18:02:50.477902 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-szfxl"] Sep 30 18:02:50 crc kubenswrapper[4778]: E0930 18:02:50.478941 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd674a69-6b5c-4374-b729-1f40e983bea8" containerName="keystone-cron" Sep 30 18:02:50 crc kubenswrapper[4778]: I0930 18:02:50.478961 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd674a69-6b5c-4374-b729-1f40e983bea8" containerName="keystone-cron" Sep 30 18:02:50 crc kubenswrapper[4778]: I0930 18:02:50.479265 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd674a69-6b5c-4374-b729-1f40e983bea8" containerName="keystone-cron" Sep 30 18:02:50 crc kubenswrapper[4778]: I0930 18:02:50.481302 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szfxl" Sep 30 18:02:50 crc kubenswrapper[4778]: I0930 18:02:50.489579 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-szfxl"] Sep 30 18:02:50 crc kubenswrapper[4778]: I0930 18:02:50.558591 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/401f2952-7592-45d3-ab8c-148b15d9d9d0-utilities\") pod \"community-operators-szfxl\" (UID: \"401f2952-7592-45d3-ab8c-148b15d9d9d0\") " pod="openshift-marketplace/community-operators-szfxl" Sep 30 18:02:50 crc kubenswrapper[4778]: I0930 18:02:50.558758 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/401f2952-7592-45d3-ab8c-148b15d9d9d0-catalog-content\") pod \"community-operators-szfxl\" (UID: \"401f2952-7592-45d3-ab8c-148b15d9d9d0\") " pod="openshift-marketplace/community-operators-szfxl" Sep 30 18:02:50 crc kubenswrapper[4778]: I0930 18:02:50.558848 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r74v8\" (UniqueName: \"kubernetes.io/projected/401f2952-7592-45d3-ab8c-148b15d9d9d0-kube-api-access-r74v8\") pod \"community-operators-szfxl\" (UID: \"401f2952-7592-45d3-ab8c-148b15d9d9d0\") " pod="openshift-marketplace/community-operators-szfxl" Sep 30 18:02:50 crc kubenswrapper[4778]: I0930 18:02:50.660805 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/401f2952-7592-45d3-ab8c-148b15d9d9d0-utilities\") pod \"community-operators-szfxl\" (UID: \"401f2952-7592-45d3-ab8c-148b15d9d9d0\") " pod="openshift-marketplace/community-operators-szfxl" Sep 30 18:02:50 crc kubenswrapper[4778]: I0930 18:02:50.660901 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/401f2952-7592-45d3-ab8c-148b15d9d9d0-catalog-content\") pod \"community-operators-szfxl\" (UID: \"401f2952-7592-45d3-ab8c-148b15d9d9d0\") " pod="openshift-marketplace/community-operators-szfxl" Sep 30 18:02:50 crc kubenswrapper[4778]: I0930 18:02:50.660953 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r74v8\" (UniqueName: \"kubernetes.io/projected/401f2952-7592-45d3-ab8c-148b15d9d9d0-kube-api-access-r74v8\") pod \"community-operators-szfxl\" (UID: \"401f2952-7592-45d3-ab8c-148b15d9d9d0\") " pod="openshift-marketplace/community-operators-szfxl" Sep 
30 18:02:50 crc kubenswrapper[4778]: I0930 18:02:50.661652 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/401f2952-7592-45d3-ab8c-148b15d9d9d0-catalog-content\") pod \"community-operators-szfxl\" (UID: \"401f2952-7592-45d3-ab8c-148b15d9d9d0\") " pod="openshift-marketplace/community-operators-szfxl" Sep 30 18:02:50 crc kubenswrapper[4778]: I0930 18:02:50.661682 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/401f2952-7592-45d3-ab8c-148b15d9d9d0-utilities\") pod \"community-operators-szfxl\" (UID: \"401f2952-7592-45d3-ab8c-148b15d9d9d0\") " pod="openshift-marketplace/community-operators-szfxl" Sep 30 18:02:50 crc kubenswrapper[4778]: I0930 18:02:50.680243 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r74v8\" (UniqueName: \"kubernetes.io/projected/401f2952-7592-45d3-ab8c-148b15d9d9d0-kube-api-access-r74v8\") pod \"community-operators-szfxl\" (UID: \"401f2952-7592-45d3-ab8c-148b15d9d9d0\") " pod="openshift-marketplace/community-operators-szfxl" Sep 30 18:02:50 crc kubenswrapper[4778]: I0930 18:02:50.858892 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szfxl" Sep 30 18:02:51 crc kubenswrapper[4778]: I0930 18:02:51.376133 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-szfxl"] Sep 30 18:02:51 crc kubenswrapper[4778]: I0930 18:02:51.550428 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szfxl" event={"ID":"401f2952-7592-45d3-ab8c-148b15d9d9d0","Type":"ContainerStarted","Data":"1e8a19d1cdc3c01bc41754a8bc1ef5a81cfcbd951b3a2ae6ce206e6536ba2fcc"} Sep 30 18:02:52 crc kubenswrapper[4778]: I0930 18:02:52.561030 4778 generic.go:334] "Generic (PLEG): container finished" podID="401f2952-7592-45d3-ab8c-148b15d9d9d0" containerID="e2d6f3825567153c21ec8354ba50b230b8084e96a7bfd99eaf23bc63a6df0c15" exitCode=0 Sep 30 18:02:52 crc kubenswrapper[4778]: I0930 18:02:52.561343 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szfxl" event={"ID":"401f2952-7592-45d3-ab8c-148b15d9d9d0","Type":"ContainerDied","Data":"e2d6f3825567153c21ec8354ba50b230b8084e96a7bfd99eaf23bc63a6df0c15"} Sep 30 18:02:52 crc kubenswrapper[4778]: I0930 18:02:52.564223 4778 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 18:02:53 crc kubenswrapper[4778]: I0930 18:02:53.574269 4778 generic.go:334] "Generic (PLEG): container finished" podID="401f2952-7592-45d3-ab8c-148b15d9d9d0" containerID="74d562238f959960c9877c1ac90dd60556979dcc321565ed16360dcce23b87ee" exitCode=0 Sep 30 18:02:53 crc kubenswrapper[4778]: I0930 18:02:53.574416 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szfxl" event={"ID":"401f2952-7592-45d3-ab8c-148b15d9d9d0","Type":"ContainerDied","Data":"74d562238f959960c9877c1ac90dd60556979dcc321565ed16360dcce23b87ee"} Sep 30 18:02:54 crc kubenswrapper[4778]: I0930 18:02:54.590713 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szfxl" event={"ID":"401f2952-7592-45d3-ab8c-148b15d9d9d0","Type":"ContainerStarted","Data":"893d7c7f7a9f5413676e5d5fc4776d4ab1541ae95789d7f0ec09c6a8d1b4fd23"} Sep 30 18:02:54 crc kubenswrapper[4778]: I0930 
Sep 30 18:02:54 crc kubenswrapper[4778]: I0930 18:02:54.611897 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-szfxl" podStartSLOduration=3.040555096 podStartE2EDuration="4.611878959s" podCreationTimestamp="2025-09-30 18:02:50 +0000 UTC" firstStartedPulling="2025-09-30 18:02:52.56382528 +0000 UTC m=+2711.553723113" lastFinishedPulling="2025-09-30 18:02:54.135149173 +0000 UTC m=+2713.125046976" observedRunningTime="2025-09-30 18:02:54.6090638 +0000 UTC m=+2713.598961603" watchObservedRunningTime="2025-09-30 18:02:54.611878959 +0000 UTC m=+2713.601776762"
Sep 30 18:03:00 crc kubenswrapper[4778]: I0930 18:03:00.859485 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-szfxl"
Sep 30 18:03:00 crc kubenswrapper[4778]: I0930 18:03:00.860767 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-szfxl"
Sep 30 18:03:00 crc kubenswrapper[4778]: I0930 18:03:00.936747 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-szfxl"
Sep 30 18:03:01 crc kubenswrapper[4778]: I0930 18:03:01.735696 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-szfxl"
Sep 30 18:03:01 crc kubenswrapper[4778]: I0930 18:03:01.811259 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-szfxl"]
Sep 30 18:03:03 crc kubenswrapper[4778]: I0930 18:03:03.680940 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-szfxl" podUID="401f2952-7592-45d3-ab8c-148b15d9d9d0" containerName="registry-server" containerID="cri-o://893d7c7f7a9f5413676e5d5fc4776d4ab1541ae95789d7f0ec09c6a8d1b4fd23" gracePeriod=2
Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.618357 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szfxl"
Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.694304 4778 generic.go:334] "Generic (PLEG): container finished" podID="401f2952-7592-45d3-ab8c-148b15d9d9d0" containerID="893d7c7f7a9f5413676e5d5fc4776d4ab1541ae95789d7f0ec09c6a8d1b4fd23" exitCode=0
Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.694464 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-szfxl"
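The startup-latency record above is internally consistent: podStartSLOduration is the end-to-end duration minus the image-pull window, computed on the monotonic clock (the m=+... readings). A sketch of that arithmetic, using the exact values logged for community-operators-szfxl:

```go
package main

import "fmt"

func main() {
	// Monotonic (m=+...) readings from the pod_startup_latency_tracker entry above.
	firstStartedPulling := 2711.553723113
	lastFinishedPulling := 2713.125046976
	e2e := 4.611878959 // podStartE2EDuration in seconds

	pull := lastFinishedPulling - firstStartedPulling // time spent pulling images
	slo := e2e - pull
	fmt.Printf("pull=%.9fs slo=%.9fs\n", pull, slo) // slo ≈ 3.040555096, matching podStartSLOduration
}
```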
Need to start a new one" pod="openshift-marketplace/community-operators-szfxl" Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.694458 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szfxl" event={"ID":"401f2952-7592-45d3-ab8c-148b15d9d9d0","Type":"ContainerDied","Data":"893d7c7f7a9f5413676e5d5fc4776d4ab1541ae95789d7f0ec09c6a8d1b4fd23"} Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.694849 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-szfxl" event={"ID":"401f2952-7592-45d3-ab8c-148b15d9d9d0","Type":"ContainerDied","Data":"1e8a19d1cdc3c01bc41754a8bc1ef5a81cfcbd951b3a2ae6ce206e6536ba2fcc"} Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.694884 4778 scope.go:117] "RemoveContainer" containerID="893d7c7f7a9f5413676e5d5fc4776d4ab1541ae95789d7f0ec09c6a8d1b4fd23" Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.719039 4778 scope.go:117] "RemoveContainer" containerID="74d562238f959960c9877c1ac90dd60556979dcc321565ed16360dcce23b87ee" Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.749339 4778 scope.go:117] "RemoveContainer" containerID="e2d6f3825567153c21ec8354ba50b230b8084e96a7bfd99eaf23bc63a6df0c15" Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.764013 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/401f2952-7592-45d3-ab8c-148b15d9d9d0-catalog-content\") pod \"401f2952-7592-45d3-ab8c-148b15d9d9d0\" (UID: \"401f2952-7592-45d3-ab8c-148b15d9d9d0\") " Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.764247 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/401f2952-7592-45d3-ab8c-148b15d9d9d0-utilities\") pod \"401f2952-7592-45d3-ab8c-148b15d9d9d0\" (UID: \"401f2952-7592-45d3-ab8c-148b15d9d9d0\") " Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.764352 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r74v8\" (UniqueName: \"kubernetes.io/projected/401f2952-7592-45d3-ab8c-148b15d9d9d0-kube-api-access-r74v8\") pod \"401f2952-7592-45d3-ab8c-148b15d9d9d0\" (UID: \"401f2952-7592-45d3-ab8c-148b15d9d9d0\") " Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.765994 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/401f2952-7592-45d3-ab8c-148b15d9d9d0-utilities" (OuterVolumeSpecName: "utilities") pod "401f2952-7592-45d3-ab8c-148b15d9d9d0" (UID: "401f2952-7592-45d3-ab8c-148b15d9d9d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.770865 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/401f2952-7592-45d3-ab8c-148b15d9d9d0-kube-api-access-r74v8" (OuterVolumeSpecName: "kube-api-access-r74v8") pod "401f2952-7592-45d3-ab8c-148b15d9d9d0" (UID: "401f2952-7592-45d3-ab8c-148b15d9d9d0"). InnerVolumeSpecName "kube-api-access-r74v8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.828218 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/401f2952-7592-45d3-ab8c-148b15d9d9d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "401f2952-7592-45d3-ab8c-148b15d9d9d0" (UID: "401f2952-7592-45d3-ab8c-148b15d9d9d0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.830164 4778 scope.go:117] "RemoveContainer" containerID="893d7c7f7a9f5413676e5d5fc4776d4ab1541ae95789d7f0ec09c6a8d1b4fd23" Sep 30 18:03:04 crc kubenswrapper[4778]: E0930 18:03:04.839814 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"893d7c7f7a9f5413676e5d5fc4776d4ab1541ae95789d7f0ec09c6a8d1b4fd23\": container with ID starting with 893d7c7f7a9f5413676e5d5fc4776d4ab1541ae95789d7f0ec09c6a8d1b4fd23 not found: ID does not exist" containerID="893d7c7f7a9f5413676e5d5fc4776d4ab1541ae95789d7f0ec09c6a8d1b4fd23" Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.839912 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"893d7c7f7a9f5413676e5d5fc4776d4ab1541ae95789d7f0ec09c6a8d1b4fd23"} err="failed to get container status \"893d7c7f7a9f5413676e5d5fc4776d4ab1541ae95789d7f0ec09c6a8d1b4fd23\": rpc error: code = NotFound desc = could not find container \"893d7c7f7a9f5413676e5d5fc4776d4ab1541ae95789d7f0ec09c6a8d1b4fd23\": container with ID starting with 893d7c7f7a9f5413676e5d5fc4776d4ab1541ae95789d7f0ec09c6a8d1b4fd23 not found: ID does not exist" Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.839953 4778 scope.go:117] "RemoveContainer" containerID="74d562238f959960c9877c1ac90dd60556979dcc321565ed16360dcce23b87ee" Sep 30 18:03:04 crc kubenswrapper[4778]: E0930 18:03:04.840298 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74d562238f959960c9877c1ac90dd60556979dcc321565ed16360dcce23b87ee\": container with ID starting with 74d562238f959960c9877c1ac90dd60556979dcc321565ed16360dcce23b87ee not found: ID does not exist" containerID="74d562238f959960c9877c1ac90dd60556979dcc321565ed16360dcce23b87ee" Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.840334 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74d562238f959960c9877c1ac90dd60556979dcc321565ed16360dcce23b87ee"} err="failed to get container status \"74d562238f959960c9877c1ac90dd60556979dcc321565ed16360dcce23b87ee\": rpc error: code = NotFound desc = could not find container \"74d562238f959960c9877c1ac90dd60556979dcc321565ed16360dcce23b87ee\": container with ID starting with 74d562238f959960c9877c1ac90dd60556979dcc321565ed16360dcce23b87ee not found: ID does not exist" Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.840346 4778 scope.go:117] "RemoveContainer" containerID="e2d6f3825567153c21ec8354ba50b230b8084e96a7bfd99eaf23bc63a6df0c15" Sep 30 18:03:04 crc kubenswrapper[4778]: E0930 18:03:04.840587 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2d6f3825567153c21ec8354ba50b230b8084e96a7bfd99eaf23bc63a6df0c15\": container with ID starting with e2d6f3825567153c21ec8354ba50b230b8084e96a7bfd99eaf23bc63a6df0c15 not found: ID does not exist" 
containerID="e2d6f3825567153c21ec8354ba50b230b8084e96a7bfd99eaf23bc63a6df0c15" Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.840611 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2d6f3825567153c21ec8354ba50b230b8084e96a7bfd99eaf23bc63a6df0c15"} err="failed to get container status \"e2d6f3825567153c21ec8354ba50b230b8084e96a7bfd99eaf23bc63a6df0c15\": rpc error: code = NotFound desc = could not find container \"e2d6f3825567153c21ec8354ba50b230b8084e96a7bfd99eaf23bc63a6df0c15\": container with ID starting with e2d6f3825567153c21ec8354ba50b230b8084e96a7bfd99eaf23bc63a6df0c15 not found: ID does not exist" Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.866263 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/401f2952-7592-45d3-ab8c-148b15d9d9d0-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.866300 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r74v8\" (UniqueName: \"kubernetes.io/projected/401f2952-7592-45d3-ab8c-148b15d9d9d0-kube-api-access-r74v8\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:04 crc kubenswrapper[4778]: I0930 18:03:04.866312 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/401f2952-7592-45d3-ab8c-148b15d9d9d0-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:05 crc kubenswrapper[4778]: I0930 18:03:05.041283 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-szfxl"] Sep 30 18:03:05 crc kubenswrapper[4778]: I0930 18:03:05.059486 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-szfxl"] Sep 30 18:03:05 crc kubenswrapper[4778]: I0930 18:03:05.732855 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="401f2952-7592-45d3-ab8c-148b15d9d9d0" path="/var/lib/kubelet/pods/401f2952-7592-45d3-ab8c-148b15d9d9d0/volumes" Sep 30 18:03:19 crc kubenswrapper[4778]: I0930 18:03:19.537641 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xh5sb/must-gather-mmcld"] Sep 30 18:03:19 crc kubenswrapper[4778]: E0930 18:03:19.538651 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401f2952-7592-45d3-ab8c-148b15d9d9d0" containerName="registry-server" Sep 30 18:03:19 crc kubenswrapper[4778]: I0930 18:03:19.538668 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="401f2952-7592-45d3-ab8c-148b15d9d9d0" containerName="registry-server" Sep 30 18:03:19 crc kubenswrapper[4778]: E0930 18:03:19.538695 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401f2952-7592-45d3-ab8c-148b15d9d9d0" containerName="extract-utilities" Sep 30 18:03:19 crc kubenswrapper[4778]: I0930 18:03:19.538702 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="401f2952-7592-45d3-ab8c-148b15d9d9d0" containerName="extract-utilities" Sep 30 18:03:19 crc kubenswrapper[4778]: E0930 18:03:19.538721 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="401f2952-7592-45d3-ab8c-148b15d9d9d0" containerName="extract-content" Sep 30 18:03:19 crc kubenswrapper[4778]: I0930 18:03:19.538728 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="401f2952-7592-45d3-ab8c-148b15d9d9d0" containerName="extract-content" Sep 30 18:03:19 crc kubenswrapper[4778]: I0930 18:03:19.538926 4778 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="401f2952-7592-45d3-ab8c-148b15d9d9d0" containerName="registry-server" Sep 30 18:03:19 crc kubenswrapper[4778]: I0930 18:03:19.540075 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xh5sb/must-gather-mmcld" Sep 30 18:03:19 crc kubenswrapper[4778]: I0930 18:03:19.543424 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xh5sb"/"kube-root-ca.crt" Sep 30 18:03:19 crc kubenswrapper[4778]: I0930 18:03:19.543730 4778 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-xh5sb"/"openshift-service-ca.crt" Sep 30 18:03:19 crc kubenswrapper[4778]: I0930 18:03:19.598596 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/84331001-fb2f-4d9f-a875-6f0ecdd92c5e-must-gather-output\") pod \"must-gather-mmcld\" (UID: \"84331001-fb2f-4d9f-a875-6f0ecdd92c5e\") " pod="openshift-must-gather-xh5sb/must-gather-mmcld" Sep 30 18:03:19 crc kubenswrapper[4778]: I0930 18:03:19.598787 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc7x7\" (UniqueName: \"kubernetes.io/projected/84331001-fb2f-4d9f-a875-6f0ecdd92c5e-kube-api-access-jc7x7\") pod \"must-gather-mmcld\" (UID: \"84331001-fb2f-4d9f-a875-6f0ecdd92c5e\") " pod="openshift-must-gather-xh5sb/must-gather-mmcld" Sep 30 18:03:19 crc kubenswrapper[4778]: I0930 18:03:19.600380 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xh5sb/must-gather-mmcld"] Sep 30 18:03:19 crc kubenswrapper[4778]: I0930 18:03:19.700446 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc7x7\" (UniqueName: \"kubernetes.io/projected/84331001-fb2f-4d9f-a875-6f0ecdd92c5e-kube-api-access-jc7x7\") pod \"must-gather-mmcld\" (UID: \"84331001-fb2f-4d9f-a875-6f0ecdd92c5e\") " pod="openshift-must-gather-xh5sb/must-gather-mmcld" Sep 30 18:03:19 crc kubenswrapper[4778]: I0930 18:03:19.700652 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/84331001-fb2f-4d9f-a875-6f0ecdd92c5e-must-gather-output\") pod \"must-gather-mmcld\" (UID: \"84331001-fb2f-4d9f-a875-6f0ecdd92c5e\") " pod="openshift-must-gather-xh5sb/must-gather-mmcld" Sep 30 18:03:19 crc kubenswrapper[4778]: I0930 18:03:19.701135 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/84331001-fb2f-4d9f-a875-6f0ecdd92c5e-must-gather-output\") pod \"must-gather-mmcld\" (UID: \"84331001-fb2f-4d9f-a875-6f0ecdd92c5e\") " pod="openshift-must-gather-xh5sb/must-gather-mmcld" Sep 30 18:03:19 crc kubenswrapper[4778]: I0930 18:03:19.718945 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc7x7\" (UniqueName: \"kubernetes.io/projected/84331001-fb2f-4d9f-a875-6f0ecdd92c5e-kube-api-access-jc7x7\") pod \"must-gather-mmcld\" (UID: \"84331001-fb2f-4d9f-a875-6f0ecdd92c5e\") " pod="openshift-must-gather-xh5sb/must-gather-mmcld" Sep 30 18:03:19 crc kubenswrapper[4778]: I0930 18:03:19.857070 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xh5sb/must-gather-mmcld" Sep 30 18:03:20 crc kubenswrapper[4778]: I0930 18:03:20.082703 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xh5sb/must-gather-mmcld"] Sep 30 18:03:20 crc kubenswrapper[4778]: I0930 18:03:20.860868 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh5sb/must-gather-mmcld" event={"ID":"84331001-fb2f-4d9f-a875-6f0ecdd92c5e","Type":"ContainerStarted","Data":"b97ffe63219018468788c313eeb0f570e0c2f4f2ce48faf7472c552260b12177"} Sep 30 18:03:24 crc kubenswrapper[4778]: I0930 18:03:24.898131 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh5sb/must-gather-mmcld" event={"ID":"84331001-fb2f-4d9f-a875-6f0ecdd92c5e","Type":"ContainerStarted","Data":"8b32c1368b232e82aafe8384a1ac58a1ff498d3216f5d99d8a5e671fa9928ea9"} Sep 30 18:03:24 crc kubenswrapper[4778]: I0930 18:03:24.898656 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh5sb/must-gather-mmcld" event={"ID":"84331001-fb2f-4d9f-a875-6f0ecdd92c5e","Type":"ContainerStarted","Data":"ffb4f73c5887984c9ea13d94cb2a53eff113729d3ca8df9f981911e51d0fb132"} Sep 30 18:03:24 crc kubenswrapper[4778]: I0930 18:03:24.926114 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xh5sb/must-gather-mmcld" podStartSLOduration=2.145989106 podStartE2EDuration="5.926079239s" podCreationTimestamp="2025-09-30 18:03:19 +0000 UTC" firstStartedPulling="2025-09-30 18:03:20.093209073 +0000 UTC m=+2739.083106876" lastFinishedPulling="2025-09-30 18:03:23.873299206 +0000 UTC m=+2742.863197009" observedRunningTime="2025-09-30 18:03:24.914943988 +0000 UTC m=+2743.904841821" watchObservedRunningTime="2025-09-30 18:03:24.926079239 +0000 UTC m=+2743.915977092" Sep 30 18:03:28 crc kubenswrapper[4778]: I0930 18:03:28.022990 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xh5sb/crc-debug-zmlnq"] Sep 30 18:03:28 crc kubenswrapper[4778]: I0930 18:03:28.024274 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xh5sb/crc-debug-zmlnq" Sep 30 18:03:28 crc kubenswrapper[4778]: I0930 18:03:28.027159 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-xh5sb"/"default-dockercfg-wnk5j" Sep 30 18:03:28 crc kubenswrapper[4778]: I0930 18:03:28.109104 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c69989d-5d3b-4207-bd72-3a88a16f98bf-host\") pod \"crc-debug-zmlnq\" (UID: \"4c69989d-5d3b-4207-bd72-3a88a16f98bf\") " pod="openshift-must-gather-xh5sb/crc-debug-zmlnq" Sep 30 18:03:28 crc kubenswrapper[4778]: I0930 18:03:28.109170 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9hpt\" (UniqueName: \"kubernetes.io/projected/4c69989d-5d3b-4207-bd72-3a88a16f98bf-kube-api-access-h9hpt\") pod \"crc-debug-zmlnq\" (UID: \"4c69989d-5d3b-4207-bd72-3a88a16f98bf\") " pod="openshift-must-gather-xh5sb/crc-debug-zmlnq" Sep 30 18:03:28 crc kubenswrapper[4778]: I0930 18:03:28.211243 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c69989d-5d3b-4207-bd72-3a88a16f98bf-host\") pod \"crc-debug-zmlnq\" (UID: \"4c69989d-5d3b-4207-bd72-3a88a16f98bf\") " pod="openshift-must-gather-xh5sb/crc-debug-zmlnq" Sep 30 18:03:28 crc kubenswrapper[4778]: I0930 18:03:28.211315 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9hpt\" (UniqueName: \"kubernetes.io/projected/4c69989d-5d3b-4207-bd72-3a88a16f98bf-kube-api-access-h9hpt\") pod \"crc-debug-zmlnq\" (UID: \"4c69989d-5d3b-4207-bd72-3a88a16f98bf\") " pod="openshift-must-gather-xh5sb/crc-debug-zmlnq" Sep 30 18:03:28 crc kubenswrapper[4778]: I0930 18:03:28.211407 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c69989d-5d3b-4207-bd72-3a88a16f98bf-host\") pod \"crc-debug-zmlnq\" (UID: \"4c69989d-5d3b-4207-bd72-3a88a16f98bf\") " pod="openshift-must-gather-xh5sb/crc-debug-zmlnq" Sep 30 18:03:28 crc kubenswrapper[4778]: I0930 18:03:28.235721 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9hpt\" (UniqueName: \"kubernetes.io/projected/4c69989d-5d3b-4207-bd72-3a88a16f98bf-kube-api-access-h9hpt\") pod \"crc-debug-zmlnq\" (UID: \"4c69989d-5d3b-4207-bd72-3a88a16f98bf\") " pod="openshift-must-gather-xh5sb/crc-debug-zmlnq" Sep 30 18:03:28 crc kubenswrapper[4778]: I0930 18:03:28.347288 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xh5sb/crc-debug-zmlnq" Sep 30 18:03:28 crc kubenswrapper[4778]: I0930 18:03:28.946955 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh5sb/crc-debug-zmlnq" event={"ID":"4c69989d-5d3b-4207-bd72-3a88a16f98bf","Type":"ContainerStarted","Data":"38df91e029f88473cd39e86877e847adb347f4164b3d064dd6bdbbda21358f35"} Sep 30 18:03:30 crc kubenswrapper[4778]: I0930 18:03:30.915970 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kvwm7"] Sep 30 18:03:30 crc kubenswrapper[4778]: I0930 18:03:30.918868 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kvwm7" Sep 30 18:03:30 crc kubenswrapper[4778]: I0930 18:03:30.938419 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kvwm7"] Sep 30 18:03:30 crc kubenswrapper[4778]: I0930 18:03:30.964992 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7dwx\" (UniqueName: \"kubernetes.io/projected/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-kube-api-access-j7dwx\") pod \"redhat-marketplace-kvwm7\" (UID: \"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8\") " pod="openshift-marketplace/redhat-marketplace-kvwm7" Sep 30 18:03:30 crc kubenswrapper[4778]: I0930 18:03:30.965082 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-catalog-content\") pod \"redhat-marketplace-kvwm7\" (UID: \"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8\") " pod="openshift-marketplace/redhat-marketplace-kvwm7" Sep 30 18:03:30 crc kubenswrapper[4778]: I0930 18:03:30.965141 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-utilities\") pod \"redhat-marketplace-kvwm7\" (UID: \"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8\") " pod="openshift-marketplace/redhat-marketplace-kvwm7" Sep 30 18:03:31 crc kubenswrapper[4778]: I0930 18:03:31.066954 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-catalog-content\") pod \"redhat-marketplace-kvwm7\" (UID: \"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8\") " pod="openshift-marketplace/redhat-marketplace-kvwm7" Sep 30 18:03:31 crc kubenswrapper[4778]: I0930 18:03:31.067046 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-utilities\") pod \"redhat-marketplace-kvwm7\" (UID: \"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8\") " pod="openshift-marketplace/redhat-marketplace-kvwm7" Sep 30 18:03:31 crc kubenswrapper[4778]: I0930 18:03:31.067131 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7dwx\" (UniqueName: \"kubernetes.io/projected/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-kube-api-access-j7dwx\") pod \"redhat-marketplace-kvwm7\" (UID: \"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8\") " pod="openshift-marketplace/redhat-marketplace-kvwm7" Sep 30 18:03:31 crc kubenswrapper[4778]: I0930 18:03:31.067428 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-catalog-content\") pod \"redhat-marketplace-kvwm7\" (UID: \"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8\") " pod="openshift-marketplace/redhat-marketplace-kvwm7" Sep 30 18:03:31 crc kubenswrapper[4778]: I0930 18:03:31.067651 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-utilities\") pod \"redhat-marketplace-kvwm7\" (UID: \"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8\") " pod="openshift-marketplace/redhat-marketplace-kvwm7" Sep 30 18:03:31 crc kubenswrapper[4778]: I0930 18:03:31.096490 4778 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-j7dwx\" (UniqueName: \"kubernetes.io/projected/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-kube-api-access-j7dwx\") pod \"redhat-marketplace-kvwm7\" (UID: \"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8\") " pod="openshift-marketplace/redhat-marketplace-kvwm7" Sep 30 18:03:31 crc kubenswrapper[4778]: I0930 18:03:31.253245 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kvwm7" Sep 30 18:03:31 crc kubenswrapper[4778]: I0930 18:03:31.792032 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kvwm7"] Sep 30 18:03:31 crc kubenswrapper[4778]: I0930 18:03:31.977737 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kvwm7" event={"ID":"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8","Type":"ContainerStarted","Data":"d711ce3031af8ec05ec4faa923fd5f15832f1d42fa6b6c691e9cd6844342505b"} Sep 30 18:03:32 crc kubenswrapper[4778]: I0930 18:03:32.990713 4778 generic.go:334] "Generic (PLEG): container finished" podID="cf9ef283-4a2a-4567-a09c-c16e26b4d1c8" containerID="d94829f417a367221ded18d647b1e51e4510a68d25ee9b89d010b8d4edbca558" exitCode=0 Sep 30 18:03:32 crc kubenswrapper[4778]: I0930 18:03:32.990915 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kvwm7" event={"ID":"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8","Type":"ContainerDied","Data":"d94829f417a367221ded18d647b1e51e4510a68d25ee9b89d010b8d4edbca558"} Sep 30 18:03:41 crc kubenswrapper[4778]: I0930 18:03:41.059643 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh5sb/crc-debug-zmlnq" event={"ID":"4c69989d-5d3b-4207-bd72-3a88a16f98bf","Type":"ContainerStarted","Data":"d6e002574b59f2cb8d1409054b379fdc4ea7c294765fb2f7deb3ff86f1c491a5"} Sep 30 18:03:41 crc kubenswrapper[4778]: I0930 18:03:41.063533 4778 generic.go:334] "Generic (PLEG): container finished" podID="cf9ef283-4a2a-4567-a09c-c16e26b4d1c8" containerID="e52e3c18740896afa2398f29628c613328973f461a3aecd56dd7213038e34ab1" exitCode=0 Sep 30 18:03:41 crc kubenswrapper[4778]: I0930 18:03:41.063565 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kvwm7" event={"ID":"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8","Type":"ContainerDied","Data":"e52e3c18740896afa2398f29628c613328973f461a3aecd56dd7213038e34ab1"} Sep 30 18:03:41 crc kubenswrapper[4778]: I0930 18:03:41.086021 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xh5sb/crc-debug-zmlnq" podStartSLOduration=0.83779134 podStartE2EDuration="13.085995347s" podCreationTimestamp="2025-09-30 18:03:28 +0000 UTC" firstStartedPulling="2025-09-30 18:03:28.376765938 +0000 UTC m=+2747.366663741" lastFinishedPulling="2025-09-30 18:03:40.624969945 +0000 UTC m=+2759.614867748" observedRunningTime="2025-09-30 18:03:41.079159022 +0000 UTC m=+2760.069056825" watchObservedRunningTime="2025-09-30 18:03:41.085995347 +0000 UTC m=+2760.075893160" Sep 30 18:03:42 crc kubenswrapper[4778]: I0930 18:03:42.076776 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kvwm7" event={"ID":"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8","Type":"ContainerStarted","Data":"51dff287025e20185b3a95d3d154fc6a8a326b3aeb9688d1349e1bcca15c0746"} Sep 30 18:03:42 crc kubenswrapper[4778]: I0930 18:03:42.110863 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-kvwm7" podStartSLOduration=3.597253527 podStartE2EDuration="12.110840221s" podCreationTimestamp="2025-09-30 18:03:30 +0000 UTC" firstStartedPulling="2025-09-30 18:03:32.994132499 +0000 UTC m=+2751.984030302" lastFinishedPulling="2025-09-30 18:03:41.507719183 +0000 UTC m=+2760.497616996" observedRunningTime="2025-09-30 18:03:42.106317259 +0000 UTC m=+2761.096215052" watchObservedRunningTime="2025-09-30 18:03:42.110840221 +0000 UTC m=+2761.100738034" Sep 30 18:03:51 crc kubenswrapper[4778]: I0930 18:03:51.253557 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kvwm7" Sep 30 18:03:51 crc kubenswrapper[4778]: I0930 18:03:51.255891 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kvwm7" Sep 30 18:03:51 crc kubenswrapper[4778]: I0930 18:03:51.334182 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kvwm7" Sep 30 18:03:52 crc kubenswrapper[4778]: I0930 18:03:52.237902 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kvwm7" Sep 30 18:03:52 crc kubenswrapper[4778]: I0930 18:03:52.282582 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kvwm7"] Sep 30 18:03:54 crc kubenswrapper[4778]: I0930 18:03:54.177388 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kvwm7" podUID="cf9ef283-4a2a-4567-a09c-c16e26b4d1c8" containerName="registry-server" containerID="cri-o://51dff287025e20185b3a95d3d154fc6a8a326b3aeb9688d1349e1bcca15c0746" gracePeriod=2 Sep 30 18:03:54 crc kubenswrapper[4778]: I0930 18:03:54.637171 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kvwm7" Sep 30 18:03:54 crc kubenswrapper[4778]: I0930 18:03:54.681193 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7dwx\" (UniqueName: \"kubernetes.io/projected/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-kube-api-access-j7dwx\") pod \"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8\" (UID: \"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8\") " Sep 30 18:03:54 crc kubenswrapper[4778]: I0930 18:03:54.681258 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-catalog-content\") pod \"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8\" (UID: \"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8\") " Sep 30 18:03:54 crc kubenswrapper[4778]: I0930 18:03:54.681457 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-utilities\") pod \"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8\" (UID: \"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8\") " Sep 30 18:03:54 crc kubenswrapper[4778]: I0930 18:03:54.682446 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-utilities" (OuterVolumeSpecName: "utilities") pod "cf9ef283-4a2a-4567-a09c-c16e26b4d1c8" (UID: "cf9ef283-4a2a-4567-a09c-c16e26b4d1c8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:54 crc kubenswrapper[4778]: I0930 18:03:54.692965 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf9ef283-4a2a-4567-a09c-c16e26b4d1c8" (UID: "cf9ef283-4a2a-4567-a09c-c16e26b4d1c8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:03:54 crc kubenswrapper[4778]: I0930 18:03:54.698970 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-kube-api-access-j7dwx" (OuterVolumeSpecName: "kube-api-access-j7dwx") pod "cf9ef283-4a2a-4567-a09c-c16e26b4d1c8" (UID: "cf9ef283-4a2a-4567-a09c-c16e26b4d1c8"). InnerVolumeSpecName "kube-api-access-j7dwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:03:54 crc kubenswrapper[4778]: I0930 18:03:54.784080 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:54 crc kubenswrapper[4778]: I0930 18:03:54.784395 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7dwx\" (UniqueName: \"kubernetes.io/projected/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-kube-api-access-j7dwx\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:54 crc kubenswrapper[4778]: I0930 18:03:54.784406 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:03:55 crc kubenswrapper[4778]: I0930 18:03:55.188263 4778 generic.go:334] "Generic (PLEG): container finished" podID="cf9ef283-4a2a-4567-a09c-c16e26b4d1c8" containerID="51dff287025e20185b3a95d3d154fc6a8a326b3aeb9688d1349e1bcca15c0746" exitCode=0 Sep 30 18:03:55 crc kubenswrapper[4778]: I0930 18:03:55.188321 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kvwm7" event={"ID":"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8","Type":"ContainerDied","Data":"51dff287025e20185b3a95d3d154fc6a8a326b3aeb9688d1349e1bcca15c0746"} Sep 30 18:03:55 crc kubenswrapper[4778]: I0930 18:03:55.188385 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kvwm7" event={"ID":"cf9ef283-4a2a-4567-a09c-c16e26b4d1c8","Type":"ContainerDied","Data":"d711ce3031af8ec05ec4faa923fd5f15832f1d42fa6b6c691e9cd6844342505b"} Sep 30 18:03:55 crc kubenswrapper[4778]: I0930 18:03:55.188406 4778 scope.go:117] "RemoveContainer" containerID="51dff287025e20185b3a95d3d154fc6a8a326b3aeb9688d1349e1bcca15c0746" Sep 30 18:03:55 crc kubenswrapper[4778]: I0930 18:03:55.188409 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kvwm7" Sep 30 18:03:56 crc kubenswrapper[4778]: I0930 18:03:56.836121 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kvwm7"] Sep 30 18:03:56 crc kubenswrapper[4778]: I0930 18:03:56.838535 4778 scope.go:117] "RemoveContainer" containerID="e52e3c18740896afa2398f29628c613328973f461a3aecd56dd7213038e34ab1" Sep 30 18:03:56 crc kubenswrapper[4778]: I0930 18:03:56.845114 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kvwm7"] Sep 30 18:03:56 crc kubenswrapper[4778]: I0930 18:03:56.861476 4778 scope.go:117] "RemoveContainer" containerID="d94829f417a367221ded18d647b1e51e4510a68d25ee9b89d010b8d4edbca558" Sep 30 18:03:56 crc kubenswrapper[4778]: I0930 18:03:56.900775 4778 scope.go:117] "RemoveContainer" containerID="51dff287025e20185b3a95d3d154fc6a8a326b3aeb9688d1349e1bcca15c0746" Sep 30 18:03:56 crc kubenswrapper[4778]: E0930 18:03:56.901754 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51dff287025e20185b3a95d3d154fc6a8a326b3aeb9688d1349e1bcca15c0746\": container with ID starting with 51dff287025e20185b3a95d3d154fc6a8a326b3aeb9688d1349e1bcca15c0746 not found: ID does not exist" containerID="51dff287025e20185b3a95d3d154fc6a8a326b3aeb9688d1349e1bcca15c0746" Sep 30 18:03:56 crc kubenswrapper[4778]: I0930 18:03:56.901794 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51dff287025e20185b3a95d3d154fc6a8a326b3aeb9688d1349e1bcca15c0746"} err="failed to get container status \"51dff287025e20185b3a95d3d154fc6a8a326b3aeb9688d1349e1bcca15c0746\": rpc error: code = NotFound desc = could not find container \"51dff287025e20185b3a95d3d154fc6a8a326b3aeb9688d1349e1bcca15c0746\": container with ID starting with 51dff287025e20185b3a95d3d154fc6a8a326b3aeb9688d1349e1bcca15c0746 not found: ID does not exist" Sep 30 18:03:56 crc kubenswrapper[4778]: I0930 18:03:56.901825 4778 scope.go:117] "RemoveContainer" containerID="e52e3c18740896afa2398f29628c613328973f461a3aecd56dd7213038e34ab1" Sep 30 18:03:56 crc kubenswrapper[4778]: E0930 18:03:56.905845 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e52e3c18740896afa2398f29628c613328973f461a3aecd56dd7213038e34ab1\": container with ID starting with e52e3c18740896afa2398f29628c613328973f461a3aecd56dd7213038e34ab1 not found: ID does not exist" containerID="e52e3c18740896afa2398f29628c613328973f461a3aecd56dd7213038e34ab1" Sep 30 18:03:56 crc kubenswrapper[4778]: I0930 18:03:56.905891 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e52e3c18740896afa2398f29628c613328973f461a3aecd56dd7213038e34ab1"} err="failed to get container status \"e52e3c18740896afa2398f29628c613328973f461a3aecd56dd7213038e34ab1\": rpc error: code = NotFound desc = could not find container \"e52e3c18740896afa2398f29628c613328973f461a3aecd56dd7213038e34ab1\": container with ID starting with e52e3c18740896afa2398f29628c613328973f461a3aecd56dd7213038e34ab1 not found: ID does not exist" Sep 30 18:03:56 crc kubenswrapper[4778]: I0930 18:03:56.905920 4778 scope.go:117] "RemoveContainer" containerID="d94829f417a367221ded18d647b1e51e4510a68d25ee9b89d010b8d4edbca558" Sep 30 18:03:56 crc kubenswrapper[4778]: E0930 18:03:56.906790 4778 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d94829f417a367221ded18d647b1e51e4510a68d25ee9b89d010b8d4edbca558\": container with ID starting with d94829f417a367221ded18d647b1e51e4510a68d25ee9b89d010b8d4edbca558 not found: ID does not exist" containerID="d94829f417a367221ded18d647b1e51e4510a68d25ee9b89d010b8d4edbca558" Sep 30 18:03:56 crc kubenswrapper[4778]: I0930 18:03:56.906812 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d94829f417a367221ded18d647b1e51e4510a68d25ee9b89d010b8d4edbca558"} err="failed to get container status \"d94829f417a367221ded18d647b1e51e4510a68d25ee9b89d010b8d4edbca558\": rpc error: code = NotFound desc = could not find container \"d94829f417a367221ded18d647b1e51e4510a68d25ee9b89d010b8d4edbca558\": container with ID starting with d94829f417a367221ded18d647b1e51e4510a68d25ee9b89d010b8d4edbca558 not found: ID does not exist" Sep 30 18:03:57 crc kubenswrapper[4778]: I0930 18:03:57.725125 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf9ef283-4a2a-4567-a09c-c16e26b4d1c8" path="/var/lib/kubelet/pods/cf9ef283-4a2a-4567-a09c-c16e26b4d1c8/volumes" Sep 30 18:04:22 crc kubenswrapper[4778]: I0930 18:04:22.234589 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4471ee2f-0551-44ec-8808-8085a962de1f/cinder-api/0.log" Sep 30 18:04:22 crc kubenswrapper[4778]: I0930 18:04:22.281355 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_4471ee2f-0551-44ec-8808-8085a962de1f/cinder-api-log/0.log" Sep 30 18:04:22 crc kubenswrapper[4778]: I0930 18:04:22.513898 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e3515e8e-89f6-4a7d-ab60-53bebbc77315/cinder-scheduler/0.log" Sep 30 18:04:22 crc kubenswrapper[4778]: I0930 18:04:22.571644 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e3515e8e-89f6-4a7d-ab60-53bebbc77315/probe/0.log" Sep 30 18:04:22 crc kubenswrapper[4778]: I0930 18:04:22.715889 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-778d8bb9d7-lvxlw_ae9e27fc-d20d-4adc-9a77-8a29bb1b262b/init/0.log" Sep 30 18:04:22 crc kubenswrapper[4778]: I0930 18:04:22.933023 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-778d8bb9d7-lvxlw_ae9e27fc-d20d-4adc-9a77-8a29bb1b262b/init/0.log" Sep 30 18:04:22 crc kubenswrapper[4778]: I0930 18:04:22.934296 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-778d8bb9d7-lvxlw_ae9e27fc-d20d-4adc-9a77-8a29bb1b262b/dnsmasq-dns/0.log" Sep 30 18:04:23 crc kubenswrapper[4778]: I0930 18:04:23.096019 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d1abcff4-99b6-41ab-a5b1-fb4c36f22711/glance-httpd/0.log" Sep 30 18:04:23 crc kubenswrapper[4778]: I0930 18:04:23.170450 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d1abcff4-99b6-41ab-a5b1-fb4c36f22711/glance-log/0.log" Sep 30 18:04:23 crc kubenswrapper[4778]: I0930 18:04:23.300867 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_df6e6a3a-5259-4462-9af0-439627e7cd46/glance-httpd/0.log" Sep 30 18:04:23 crc kubenswrapper[4778]: I0930 18:04:23.351028 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_df6e6a3a-5259-4462-9af0-439627e7cd46/glance-log/0.log" Sep 30 18:04:23 crc kubenswrapper[4778]: I0930 18:04:23.571145 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f9d6b5dcb-zrl5s_fcc69ce6-ef1c-42fc-a94e-daed766ec91d/horizon/0.log" Sep 30 18:04:23 crc kubenswrapper[4778]: I0930 18:04:23.767830 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-f9d6b5dcb-zrl5s_fcc69ce6-ef1c-42fc-a94e-daed766ec91d/horizon-log/0.log" Sep 30 18:04:23 crc kubenswrapper[4778]: I0930 18:04:23.797257 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-56c95856-xw6pr_45cbc907-3ec0-4124-9a97-9e84c4f09145/keystone-api/0.log" Sep 30 18:04:23 crc kubenswrapper[4778]: I0930 18:04:23.959226 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320921-rtbks_bd674a69-6b5c-4374-b729-1f40e983bea8/keystone-cron/0.log" Sep 30 18:04:24 crc kubenswrapper[4778]: I0930 18:04:24.203427 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-98b979bf7-grpws_613850bf-ac7d-4c00-bc60-873582d3d45e/neutron-api/0.log" Sep 30 18:04:24 crc kubenswrapper[4778]: I0930 18:04:24.418901 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-98b979bf7-grpws_613850bf-ac7d-4c00-bc60-873582d3d45e/neutron-httpd/0.log" Sep 30 18:04:24 crc kubenswrapper[4778]: I0930 18:04:24.744296 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_27d98d14-63d0-446f-8dcc-db21c137feb5/nova-api-log/0.log" Sep 30 18:04:24 crc kubenswrapper[4778]: I0930 18:04:24.925066 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_27d98d14-63d0-446f-8dcc-db21c137feb5/nova-api-api/0.log" Sep 30 18:04:25 crc kubenswrapper[4778]: I0930 18:04:25.287824 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_bb567d62-92bc-46d5-a998-e96a2469b117/nova-cell0-conductor-conductor/0.log" Sep 30 18:04:25 crc kubenswrapper[4778]: I0930 18:04:25.391696 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_85164e57-f621-42ce-84f4-bf54119c5bb6/nova-cell1-conductor-conductor/0.log" Sep 30 18:04:25 crc kubenswrapper[4778]: I0930 18:04:25.541657 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_11bab32a-a35b-4e80-8eec-fc3d8a8f16f7/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 18:04:25 crc kubenswrapper[4778]: I0930 18:04:25.673058 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4f29fe8f-af06-4ffb-b611-af0bb9c5cebb/nova-metadata-log/0.log" Sep 30 18:04:26 crc kubenswrapper[4778]: I0930 18:04:26.028808 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_dab3e484-85ba-4428-9020-04c11efe96aa/nova-scheduler-scheduler/0.log" Sep 30 18:04:26 crc kubenswrapper[4778]: I0930 18:04:26.239582 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_02bfdd7b-c081-4815-9a55-f39fa4d0384f/mysql-bootstrap/0.log" Sep 30 18:04:26 crc kubenswrapper[4778]: I0930 18:04:26.478595 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_02bfdd7b-c081-4815-9a55-f39fa4d0384f/mysql-bootstrap/0.log" Sep 30 18:04:26 crc kubenswrapper[4778]: I0930 18:04:26.528494 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_02bfdd7b-c081-4815-9a55-f39fa4d0384f/galera/0.log" Sep 30 18:04:26 crc kubenswrapper[4778]: I0930 18:04:26.647265 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4f29fe8f-af06-4ffb-b611-af0bb9c5cebb/nova-metadata-metadata/0.log" Sep 30 18:04:26 crc kubenswrapper[4778]: I0930 18:04:26.749305 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9705cb32-3888-4b76-863d-7c4dd57185bc/mysql-bootstrap/0.log" Sep 30 18:04:26 crc kubenswrapper[4778]: I0930 18:04:26.924145 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9705cb32-3888-4b76-863d-7c4dd57185bc/mysql-bootstrap/0.log" Sep 30 18:04:27 crc kubenswrapper[4778]: I0930 18:04:27.002260 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_9705cb32-3888-4b76-863d-7c4dd57185bc/galera/0.log" Sep 30 18:04:27 crc kubenswrapper[4778]: I0930 18:04:27.137853 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_9bd1cc25-8254-44e2-b64a-1711eed0609e/openstackclient/0.log" Sep 30 18:04:27 crc kubenswrapper[4778]: I0930 18:04:27.182015 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-79nnj_d43aaa67-2f2b-4045-80af-3593da87ed64/ovn-controller/0.log" Sep 30 18:04:27 crc kubenswrapper[4778]: I0930 18:04:27.430390 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-9cgft_ecc1dcfb-4095-49c5-bafb-adc94165d6c3/openstack-network-exporter/0.log" Sep 30 18:04:27 crc kubenswrapper[4778]: I0930 18:04:27.631025 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qd6f4_9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6/ovsdb-server-init/0.log" Sep 30 18:04:27 crc kubenswrapper[4778]: I0930 18:04:27.801840 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qd6f4_9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6/ovsdb-server-init/0.log" Sep 30 18:04:27 crc kubenswrapper[4778]: I0930 18:04:27.822085 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qd6f4_9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6/ovs-vswitchd/0.log" Sep 30 18:04:27 crc kubenswrapper[4778]: I0930 18:04:27.858820 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-qd6f4_9a8e6ffe-b59c-43c2-b8f2-dec2fa786df6/ovsdb-server/0.log" Sep 30 18:04:27 crc kubenswrapper[4778]: I0930 18:04:27.964345 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_21244498-232a-4725-be68-da731564a70b/memcached/0.log" Sep 30 18:04:27 crc kubenswrapper[4778]: I0930 18:04:27.981720 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2e131ac5-4895-42fa-b7f5-6a37d5eafe3c/openstack-network-exporter/0.log" Sep 30 18:04:28 crc kubenswrapper[4778]: I0930 18:04:28.058119 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_2e131ac5-4895-42fa-b7f5-6a37d5eafe3c/ovn-northd/0.log" Sep 30 18:04:28 crc kubenswrapper[4778]: I0930 18:04:28.151513 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_c60ce32d-deec-4c60-9192-6b090ae53773/openstack-network-exporter/0.log" Sep 30 18:04:28 crc kubenswrapper[4778]: I0930 18:04:28.228094 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_c60ce32d-deec-4c60-9192-6b090ae53773/ovsdbserver-nb/0.log" Sep 30 18:04:28 crc kubenswrapper[4778]: I0930 18:04:28.317035 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ce405643-c53c-472d-802c-a3d8fe7840a0/openstack-network-exporter/0.log" Sep 30 18:04:28 crc kubenswrapper[4778]: I0930 18:04:28.354573 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ce405643-c53c-472d-802c-a3d8fe7840a0/ovsdbserver-sb/0.log" Sep 30 18:04:28 crc kubenswrapper[4778]: I0930 18:04:28.553861 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-768b5657b-6fmpz_e9c1cfdd-d923-493f-a3b0-f75756047aeb/placement-api/0.log" Sep 30 18:04:28 crc kubenswrapper[4778]: I0930 18:04:28.564986 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-768b5657b-6fmpz_e9c1cfdd-d923-493f-a3b0-f75756047aeb/placement-log/0.log" Sep 30 18:04:28 crc kubenswrapper[4778]: I0930 18:04:28.650051 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1/setup-container/0.log" Sep 30 18:04:28 crc kubenswrapper[4778]: I0930 18:04:28.826543 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1/rabbitmq/0.log" Sep 30 18:04:28 crc kubenswrapper[4778]: I0930 18:04:28.835875 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_a46a2506-e4e5-4d7c-b19f-ebf1bb0922a1/setup-container/0.log" Sep 30 18:04:28 crc kubenswrapper[4778]: I0930 18:04:28.852474 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_227061d9-b3e7-4711-92f5-283ca4af1412/setup-container/0.log" Sep 30 18:04:29 crc kubenswrapper[4778]: I0930 18:04:29.026424 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_227061d9-b3e7-4711-92f5-283ca4af1412/setup-container/0.log" Sep 30 18:04:29 crc kubenswrapper[4778]: I0930 18:04:29.034833 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_227061d9-b3e7-4711-92f5-283ca4af1412/rabbitmq/0.log" Sep 30 18:05:02 crc kubenswrapper[4778]: I0930 18:05:02.791020 4778 generic.go:334] "Generic (PLEG): container finished" podID="4c69989d-5d3b-4207-bd72-3a88a16f98bf" containerID="d6e002574b59f2cb8d1409054b379fdc4ea7c294765fb2f7deb3ff86f1c491a5" exitCode=0 Sep 30 18:05:02 crc kubenswrapper[4778]: I0930 18:05:02.791139 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh5sb/crc-debug-zmlnq" event={"ID":"4c69989d-5d3b-4207-bd72-3a88a16f98bf","Type":"ContainerDied","Data":"d6e002574b59f2cb8d1409054b379fdc4ea7c294765fb2f7deb3ff86f1c491a5"} Sep 30 18:05:03 crc kubenswrapper[4778]: I0930 18:05:03.905123 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xh5sb/crc-debug-zmlnq" Sep 30 18:05:03 crc kubenswrapper[4778]: I0930 18:05:03.941229 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xh5sb/crc-debug-zmlnq"] Sep 30 18:05:03 crc kubenswrapper[4778]: I0930 18:05:03.948525 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xh5sb/crc-debug-zmlnq"] Sep 30 18:05:04 crc kubenswrapper[4778]: I0930 18:05:04.016443 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9hpt\" (UniqueName: \"kubernetes.io/projected/4c69989d-5d3b-4207-bd72-3a88a16f98bf-kube-api-access-h9hpt\") pod \"4c69989d-5d3b-4207-bd72-3a88a16f98bf\" (UID: \"4c69989d-5d3b-4207-bd72-3a88a16f98bf\") " Sep 30 18:05:04 crc kubenswrapper[4778]: I0930 18:05:04.016874 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c69989d-5d3b-4207-bd72-3a88a16f98bf-host\") pod \"4c69989d-5d3b-4207-bd72-3a88a16f98bf\" (UID: \"4c69989d-5d3b-4207-bd72-3a88a16f98bf\") " Sep 30 18:05:04 crc kubenswrapper[4778]: I0930 18:05:04.017006 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c69989d-5d3b-4207-bd72-3a88a16f98bf-host" (OuterVolumeSpecName: "host") pod "4c69989d-5d3b-4207-bd72-3a88a16f98bf" (UID: "4c69989d-5d3b-4207-bd72-3a88a16f98bf"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 18:05:04 crc kubenswrapper[4778]: I0930 18:05:04.017304 4778 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4c69989d-5d3b-4207-bd72-3a88a16f98bf-host\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:04 crc kubenswrapper[4778]: I0930 18:05:04.026234 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c69989d-5d3b-4207-bd72-3a88a16f98bf-kube-api-access-h9hpt" (OuterVolumeSpecName: "kube-api-access-h9hpt") pod "4c69989d-5d3b-4207-bd72-3a88a16f98bf" (UID: "4c69989d-5d3b-4207-bd72-3a88a16f98bf"). InnerVolumeSpecName "kube-api-access-h9hpt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:05:04 crc kubenswrapper[4778]: I0930 18:05:04.118927 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9hpt\" (UniqueName: \"kubernetes.io/projected/4c69989d-5d3b-4207-bd72-3a88a16f98bf-kube-api-access-h9hpt\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:04 crc kubenswrapper[4778]: I0930 18:05:04.813724 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38df91e029f88473cd39e86877e847adb347f4164b3d064dd6bdbbda21358f35" Sep 30 18:05:04 crc kubenswrapper[4778]: I0930 18:05:04.814046 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xh5sb/crc-debug-zmlnq" Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.131255 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xh5sb/crc-debug-v2996"] Sep 30 18:05:05 crc kubenswrapper[4778]: E0930 18:05:05.131902 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf9ef283-4a2a-4567-a09c-c16e26b4d1c8" containerName="registry-server" Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.131915 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf9ef283-4a2a-4567-a09c-c16e26b4d1c8" containerName="registry-server" Sep 30 18:05:05 crc kubenswrapper[4778]: E0930 18:05:05.131925 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c69989d-5d3b-4207-bd72-3a88a16f98bf" containerName="container-00" Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.131931 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c69989d-5d3b-4207-bd72-3a88a16f98bf" containerName="container-00" Sep 30 18:05:05 crc kubenswrapper[4778]: E0930 18:05:05.131957 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf9ef283-4a2a-4567-a09c-c16e26b4d1c8" containerName="extract-content" Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.131963 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf9ef283-4a2a-4567-a09c-c16e26b4d1c8" containerName="extract-content" Sep 30 18:05:05 crc kubenswrapper[4778]: E0930 18:05:05.131977 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf9ef283-4a2a-4567-a09c-c16e26b4d1c8" containerName="extract-utilities" Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.131983 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf9ef283-4a2a-4567-a09c-c16e26b4d1c8" containerName="extract-utilities" Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.132135 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c69989d-5d3b-4207-bd72-3a88a16f98bf" containerName="container-00" Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.132149 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf9ef283-4a2a-4567-a09c-c16e26b4d1c8" containerName="registry-server" Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.132697 4778 util.go:30] "No sandbox for pod can be found. 
Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.132697 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xh5sb/crc-debug-v2996"
Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.134900 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-xh5sb"/"default-dockercfg-wnk5j"
Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.239371 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/305c9b30-ce0f-49c6-b03a-bf39cfa62d96-host\") pod \"crc-debug-v2996\" (UID: \"305c9b30-ce0f-49c6-b03a-bf39cfa62d96\") " pod="openshift-must-gather-xh5sb/crc-debug-v2996"
Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.239721 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7wfb\" (UniqueName: \"kubernetes.io/projected/305c9b30-ce0f-49c6-b03a-bf39cfa62d96-kube-api-access-p7wfb\") pod \"crc-debug-v2996\" (UID: \"305c9b30-ce0f-49c6-b03a-bf39cfa62d96\") " pod="openshift-must-gather-xh5sb/crc-debug-v2996"
Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.341913 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7wfb\" (UniqueName: \"kubernetes.io/projected/305c9b30-ce0f-49c6-b03a-bf39cfa62d96-kube-api-access-p7wfb\") pod \"crc-debug-v2996\" (UID: \"305c9b30-ce0f-49c6-b03a-bf39cfa62d96\") " pod="openshift-must-gather-xh5sb/crc-debug-v2996"
Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.342195 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/305c9b30-ce0f-49c6-b03a-bf39cfa62d96-host\") pod \"crc-debug-v2996\" (UID: \"305c9b30-ce0f-49c6-b03a-bf39cfa62d96\") " pod="openshift-must-gather-xh5sb/crc-debug-v2996"
Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.342330 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/305c9b30-ce0f-49c6-b03a-bf39cfa62d96-host\") pod \"crc-debug-v2996\" (UID: \"305c9b30-ce0f-49c6-b03a-bf39cfa62d96\") " pod="openshift-must-gather-xh5sb/crc-debug-v2996"
Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.361583 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7wfb\" (UniqueName: \"kubernetes.io/projected/305c9b30-ce0f-49c6-b03a-bf39cfa62d96-kube-api-access-p7wfb\") pod \"crc-debug-v2996\" (UID: \"305c9b30-ce0f-49c6-b03a-bf39cfa62d96\") " pod="openshift-must-gather-xh5sb/crc-debug-v2996"
Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.469388 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xh5sb/crc-debug-v2996"
Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.750292 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c69989d-5d3b-4207-bd72-3a88a16f98bf" path="/var/lib/kubelet/pods/4c69989d-5d3b-4207-bd72-3a88a16f98bf/volumes"
Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.821702 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh5sb/crc-debug-v2996" event={"ID":"305c9b30-ce0f-49c6-b03a-bf39cfa62d96","Type":"ContainerStarted","Data":"6ffea7a2a3331963f8f42ed94e1a7a97b6e864b2971c867ec5060204d7fea49a"}
Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.821744 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh5sb/crc-debug-v2996" event={"ID":"305c9b30-ce0f-49c6-b03a-bf39cfa62d96","Type":"ContainerStarted","Data":"7c7fcb1985f15ee3b2616c0f0acf3fe484ba23464c338df6925d375173985918"}
Sep 30 18:05:05 crc kubenswrapper[4778]: I0930 18:05:05.847499 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xh5sb/crc-debug-v2996" podStartSLOduration=0.84747539 podStartE2EDuration="847.47539ms" podCreationTimestamp="2025-09-30 18:05:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 18:05:05.83348754 +0000 UTC m=+2844.823385343" watchObservedRunningTime="2025-09-30 18:05:05.84747539 +0000 UTC m=+2844.837373193"
Sep 30 18:05:06 crc kubenswrapper[4778]: I0930 18:05:06.830826 4778 generic.go:334] "Generic (PLEG): container finished" podID="305c9b30-ce0f-49c6-b03a-bf39cfa62d96" containerID="6ffea7a2a3331963f8f42ed94e1a7a97b6e864b2971c867ec5060204d7fea49a" exitCode=0
Sep 30 18:05:06 crc kubenswrapper[4778]: I0930 18:05:06.831187 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh5sb/crc-debug-v2996" event={"ID":"305c9b30-ce0f-49c6-b03a-bf39cfa62d96","Type":"ContainerDied","Data":"6ffea7a2a3331963f8f42ed94e1a7a97b6e864b2971c867ec5060204d7fea49a"}
Sep 30 18:05:07 crc kubenswrapper[4778]: I0930 18:05:07.942444 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xh5sb/crc-debug-v2996"
Sep 30 18:05:07 crc kubenswrapper[4778]: I0930 18:05:07.974182 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7wfb\" (UniqueName: \"kubernetes.io/projected/305c9b30-ce0f-49c6-b03a-bf39cfa62d96-kube-api-access-p7wfb\") pod \"305c9b30-ce0f-49c6-b03a-bf39cfa62d96\" (UID: \"305c9b30-ce0f-49c6-b03a-bf39cfa62d96\") "
Sep 30 18:05:07 crc kubenswrapper[4778]: I0930 18:05:07.974230 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/305c9b30-ce0f-49c6-b03a-bf39cfa62d96-host\") pod \"305c9b30-ce0f-49c6-b03a-bf39cfa62d96\" (UID: \"305c9b30-ce0f-49c6-b03a-bf39cfa62d96\") "
Sep 30 18:05:07 crc kubenswrapper[4778]: I0930 18:05:07.974420 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/305c9b30-ce0f-49c6-b03a-bf39cfa62d96-host" (OuterVolumeSpecName: "host") pod "305c9b30-ce0f-49c6-b03a-bf39cfa62d96" (UID: "305c9b30-ce0f-49c6-b03a-bf39cfa62d96"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 18:05:07 crc kubenswrapper[4778]: I0930 18:05:07.974685 4778 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/305c9b30-ce0f-49c6-b03a-bf39cfa62d96-host\") on node \"crc\" DevicePath \"\""
Sep 30 18:05:07 crc kubenswrapper[4778]: I0930 18:05:07.980611 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/305c9b30-ce0f-49c6-b03a-bf39cfa62d96-kube-api-access-p7wfb" (OuterVolumeSpecName: "kube-api-access-p7wfb") pod "305c9b30-ce0f-49c6-b03a-bf39cfa62d96" (UID: "305c9b30-ce0f-49c6-b03a-bf39cfa62d96"). InnerVolumeSpecName "kube-api-access-p7wfb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:05:08 crc kubenswrapper[4778]: I0930 18:05:08.075973 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7wfb\" (UniqueName: \"kubernetes.io/projected/305c9b30-ce0f-49c6-b03a-bf39cfa62d96-kube-api-access-p7wfb\") on node \"crc\" DevicePath \"\""
Sep 30 18:05:08 crc kubenswrapper[4778]: I0930 18:05:08.846984 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh5sb/crc-debug-v2996" event={"ID":"305c9b30-ce0f-49c6-b03a-bf39cfa62d96","Type":"ContainerDied","Data":"7c7fcb1985f15ee3b2616c0f0acf3fe484ba23464c338df6925d375173985918"}
Sep 30 18:05:08 crc kubenswrapper[4778]: I0930 18:05:08.847031 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c7fcb1985f15ee3b2616c0f0acf3fe484ba23464c338df6925d375173985918"
Sep 30 18:05:08 crc kubenswrapper[4778]: I0930 18:05:08.847109 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xh5sb/crc-debug-v2996"
Sep 30 18:05:10 crc kubenswrapper[4778]: I0930 18:05:10.362804 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xh5sb/crc-debug-v2996"]
Sep 30 18:05:10 crc kubenswrapper[4778]: I0930 18:05:10.372220 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xh5sb/crc-debug-v2996"]
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.599025 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-49sq7"]
Sep 30 18:05:11 crc kubenswrapper[4778]: E0930 18:05:11.599748 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="305c9b30-ce0f-49c6-b03a-bf39cfa62d96" containerName="container-00"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.599762 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="305c9b30-ce0f-49c6-b03a-bf39cfa62d96" containerName="container-00"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.600031 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="305c9b30-ce0f-49c6-b03a-bf39cfa62d96" containerName="container-00"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.601664 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49sq7"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.609245 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-49sq7"]
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.618200 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xh5sb/crc-debug-n7qrp"]
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.619939 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xh5sb/crc-debug-n7qrp"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.628222 4778 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-xh5sb"/"default-dockercfg-wnk5j"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.723450 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="305c9b30-ce0f-49c6-b03a-bf39cfa62d96" path="/var/lib/kubelet/pods/305c9b30-ce0f-49c6-b03a-bf39cfa62d96/volumes"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.742959 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxttx\" (UniqueName: \"kubernetes.io/projected/9a105625-764d-42e3-b78e-af4f717e4d63-kube-api-access-fxttx\") pod \"certified-operators-49sq7\" (UID: \"9a105625-764d-42e3-b78e-af4f717e4d63\") " pod="openshift-marketplace/certified-operators-49sq7"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.743041 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6567ba32-7599-4aa9-a6a8-350c3fa6df3f-host\") pod \"crc-debug-n7qrp\" (UID: \"6567ba32-7599-4aa9-a6a8-350c3fa6df3f\") " pod="openshift-must-gather-xh5sb/crc-debug-n7qrp"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.743176 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a105625-764d-42e3-b78e-af4f717e4d63-utilities\") pod \"certified-operators-49sq7\" (UID: \"9a105625-764d-42e3-b78e-af4f717e4d63\") " pod="openshift-marketplace/certified-operators-49sq7"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.743246 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a105625-764d-42e3-b78e-af4f717e4d63-catalog-content\") pod \"certified-operators-49sq7\" (UID: \"9a105625-764d-42e3-b78e-af4f717e4d63\") " pod="openshift-marketplace/certified-operators-49sq7"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.743299 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb7qs\" (UniqueName: \"kubernetes.io/projected/6567ba32-7599-4aa9-a6a8-350c3fa6df3f-kube-api-access-lb7qs\") pod \"crc-debug-n7qrp\" (UID: \"6567ba32-7599-4aa9-a6a8-350c3fa6df3f\") " pod="openshift-must-gather-xh5sb/crc-debug-n7qrp"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.844898 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6567ba32-7599-4aa9-a6a8-350c3fa6df3f-host\") pod \"crc-debug-n7qrp\" (UID: \"6567ba32-7599-4aa9-a6a8-350c3fa6df3f\") " pod="openshift-must-gather-xh5sb/crc-debug-n7qrp"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.845025 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6567ba32-7599-4aa9-a6a8-350c3fa6df3f-host\") pod \"crc-debug-n7qrp\" (UID: \"6567ba32-7599-4aa9-a6a8-350c3fa6df3f\") " pod="openshift-must-gather-xh5sb/crc-debug-n7qrp"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.845078 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a105625-764d-42e3-b78e-af4f717e4d63-utilities\") pod \"certified-operators-49sq7\" (UID: \"9a105625-764d-42e3-b78e-af4f717e4d63\") " pod="openshift-marketplace/certified-operators-49sq7"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.845176 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a105625-764d-42e3-b78e-af4f717e4d63-catalog-content\") pod \"certified-operators-49sq7\" (UID: \"9a105625-764d-42e3-b78e-af4f717e4d63\") " pod="openshift-marketplace/certified-operators-49sq7"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.845244 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb7qs\" (UniqueName: \"kubernetes.io/projected/6567ba32-7599-4aa9-a6a8-350c3fa6df3f-kube-api-access-lb7qs\") pod \"crc-debug-n7qrp\" (UID: \"6567ba32-7599-4aa9-a6a8-350c3fa6df3f\") " pod="openshift-must-gather-xh5sb/crc-debug-n7qrp"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.845294 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxttx\" (UniqueName: \"kubernetes.io/projected/9a105625-764d-42e3-b78e-af4f717e4d63-kube-api-access-fxttx\") pod \"certified-operators-49sq7\" (UID: \"9a105625-764d-42e3-b78e-af4f717e4d63\") " pod="openshift-marketplace/certified-operators-49sq7"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.845945 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a105625-764d-42e3-b78e-af4f717e4d63-utilities\") pod \"certified-operators-49sq7\" (UID: \"9a105625-764d-42e3-b78e-af4f717e4d63\") " pod="openshift-marketplace/certified-operators-49sq7"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.846224 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a105625-764d-42e3-b78e-af4f717e4d63-catalog-content\") pod \"certified-operators-49sq7\" (UID: \"9a105625-764d-42e3-b78e-af4f717e4d63\") " pod="openshift-marketplace/certified-operators-49sq7"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.866211 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb7qs\" (UniqueName: \"kubernetes.io/projected/6567ba32-7599-4aa9-a6a8-350c3fa6df3f-kube-api-access-lb7qs\") pod \"crc-debug-n7qrp\" (UID: \"6567ba32-7599-4aa9-a6a8-350c3fa6df3f\") " pod="openshift-must-gather-xh5sb/crc-debug-n7qrp"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.869300 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxttx\" (UniqueName: \"kubernetes.io/projected/9a105625-764d-42e3-b78e-af4f717e4d63-kube-api-access-fxttx\") pod \"certified-operators-49sq7\" (UID: \"9a105625-764d-42e3-b78e-af4f717e4d63\") " pod="openshift-marketplace/certified-operators-49sq7"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.931015 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49sq7"
Sep 30 18:05:11 crc kubenswrapper[4778]: I0930 18:05:11.937607 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xh5sb/crc-debug-n7qrp"
Sep 30 18:05:12 crc kubenswrapper[4778]: W0930 18:05:12.478555 4778 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a105625_764d_42e3_b78e_af4f717e4d63.slice/crio-f5da521ee9eda19295777e1e4bf87fa031ec0f2861a4c0d65ac400ffd73d9241 WatchSource:0}: Error finding container f5da521ee9eda19295777e1e4bf87fa031ec0f2861a4c0d65ac400ffd73d9241: Status 404 returned error can't find the container with id f5da521ee9eda19295777e1e4bf87fa031ec0f2861a4c0d65ac400ffd73d9241
Sep 30 18:05:12 crc kubenswrapper[4778]: I0930 18:05:12.478932 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-49sq7"]
Sep 30 18:05:12 crc kubenswrapper[4778]: I0930 18:05:12.896459 4778 generic.go:334] "Generic (PLEG): container finished" podID="9a105625-764d-42e3-b78e-af4f717e4d63" containerID="4636ea4cf846f010e1af8cd879bc5863ddff7cbaa116249c2cbbd148ca70f236" exitCode=0
Sep 30 18:05:12 crc kubenswrapper[4778]: I0930 18:05:12.896740 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49sq7" event={"ID":"9a105625-764d-42e3-b78e-af4f717e4d63","Type":"ContainerDied","Data":"4636ea4cf846f010e1af8cd879bc5863ddff7cbaa116249c2cbbd148ca70f236"}
Sep 30 18:05:12 crc kubenswrapper[4778]: I0930 18:05:12.897938 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49sq7" event={"ID":"9a105625-764d-42e3-b78e-af4f717e4d63","Type":"ContainerStarted","Data":"f5da521ee9eda19295777e1e4bf87fa031ec0f2861a4c0d65ac400ffd73d9241"}
Sep 30 18:05:12 crc kubenswrapper[4778]: I0930 18:05:12.901978 4778 generic.go:334] "Generic (PLEG): container finished" podID="6567ba32-7599-4aa9-a6a8-350c3fa6df3f" containerID="4eab2413278d8dd67e6f989ccf815cc916e1562e23bdf6d14ff905575a5aa779" exitCode=0
Sep 30 18:05:12 crc kubenswrapper[4778]: I0930 18:05:12.902035 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh5sb/crc-debug-n7qrp" event={"ID":"6567ba32-7599-4aa9-a6a8-350c3fa6df3f","Type":"ContainerDied","Data":"4eab2413278d8dd67e6f989ccf815cc916e1562e23bdf6d14ff905575a5aa779"}
Sep 30 18:05:12 crc kubenswrapper[4778]: I0930 18:05:12.902091 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh5sb/crc-debug-n7qrp" event={"ID":"6567ba32-7599-4aa9-a6a8-350c3fa6df3f","Type":"ContainerStarted","Data":"ad4dc413f53d4fa409dc97d96b1d584fb5f08933f2867a5bbb5d569ba512b051"}
Sep 30 18:05:12 crc kubenswrapper[4778]: I0930 18:05:12.962548 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xh5sb/crc-debug-n7qrp"]
Sep 30 18:05:12 crc kubenswrapper[4778]: I0930 18:05:12.972906 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xh5sb/crc-debug-n7qrp"]
Sep 30 18:05:13 crc kubenswrapper[4778]: I0930 18:05:13.920353 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49sq7" event={"ID":"9a105625-764d-42e3-b78e-af4f717e4d63","Type":"ContainerStarted","Data":"af48b288899d1803d09d91c642ba1bb6298062f65240920074a0b4c9baa4119a"}
Sep 30 18:05:14 crc kubenswrapper[4778]: I0930 18:05:14.022449 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xh5sb/crc-debug-n7qrp"
Sep 30 18:05:14 crc kubenswrapper[4778]: I0930 18:05:14.189053 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6567ba32-7599-4aa9-a6a8-350c3fa6df3f-host\") pod \"6567ba32-7599-4aa9-a6a8-350c3fa6df3f\" (UID: \"6567ba32-7599-4aa9-a6a8-350c3fa6df3f\") "
Sep 30 18:05:14 crc kubenswrapper[4778]: I0930 18:05:14.189325 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6567ba32-7599-4aa9-a6a8-350c3fa6df3f-host" (OuterVolumeSpecName: "host") pod "6567ba32-7599-4aa9-a6a8-350c3fa6df3f" (UID: "6567ba32-7599-4aa9-a6a8-350c3fa6df3f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 18:05:14 crc kubenswrapper[4778]: I0930 18:05:14.189677 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb7qs\" (UniqueName: \"kubernetes.io/projected/6567ba32-7599-4aa9-a6a8-350c3fa6df3f-kube-api-access-lb7qs\") pod \"6567ba32-7599-4aa9-a6a8-350c3fa6df3f\" (UID: \"6567ba32-7599-4aa9-a6a8-350c3fa6df3f\") "
Sep 30 18:05:14 crc kubenswrapper[4778]: I0930 18:05:14.190265 4778 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6567ba32-7599-4aa9-a6a8-350c3fa6df3f-host\") on node \"crc\" DevicePath \"\""
Sep 30 18:05:14 crc kubenswrapper[4778]: I0930 18:05:14.194685 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6567ba32-7599-4aa9-a6a8-350c3fa6df3f-kube-api-access-lb7qs" (OuterVolumeSpecName: "kube-api-access-lb7qs") pod "6567ba32-7599-4aa9-a6a8-350c3fa6df3f" (UID: "6567ba32-7599-4aa9-a6a8-350c3fa6df3f"). InnerVolumeSpecName "kube-api-access-lb7qs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 18:05:14 crc kubenswrapper[4778]: I0930 18:05:14.291654 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb7qs\" (UniqueName: \"kubernetes.io/projected/6567ba32-7599-4aa9-a6a8-350c3fa6df3f-kube-api-access-lb7qs\") on node \"crc\" DevicePath \"\""
Sep 30 18:05:14 crc kubenswrapper[4778]: I0930 18:05:14.818382 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 18:05:14 crc kubenswrapper[4778]: I0930 18:05:14.818485 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 18:05:14 crc kubenswrapper[4778]: I0930 18:05:14.855008 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb_87932d5f-678b-4115-8fa4-37a1fa008062/util/0.log"
Sep 30 18:05:14 crc kubenswrapper[4778]: I0930 18:05:14.929955 4778 generic.go:334] "Generic (PLEG): container finished" podID="9a105625-764d-42e3-b78e-af4f717e4d63" containerID="af48b288899d1803d09d91c642ba1bb6298062f65240920074a0b4c9baa4119a" exitCode=0
Sep 30 18:05:14 crc kubenswrapper[4778]: I0930 18:05:14.930059 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49sq7" event={"ID":"9a105625-764d-42e3-b78e-af4f717e4d63","Type":"ContainerDied","Data":"af48b288899d1803d09d91c642ba1bb6298062f65240920074a0b4c9baa4119a"}
Sep 30 18:05:14 crc kubenswrapper[4778]: I0930 18:05:14.931818 4778 scope.go:117] "RemoveContainer" containerID="4eab2413278d8dd67e6f989ccf815cc916e1562e23bdf6d14ff905575a5aa779"
Sep 30 18:05:14 crc kubenswrapper[4778]: I0930 18:05:14.932024 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xh5sb/crc-debug-n7qrp"
Sep 30 18:05:14 crc kubenswrapper[4778]: I0930 18:05:14.988197 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb_87932d5f-678b-4115-8fa4-37a1fa008062/util/0.log"
Sep 30 18:05:15 crc kubenswrapper[4778]: I0930 18:05:15.047110 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb_87932d5f-678b-4115-8fa4-37a1fa008062/pull/0.log"
Sep 30 18:05:15 crc kubenswrapper[4778]: I0930 18:05:15.078581 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb_87932d5f-678b-4115-8fa4-37a1fa008062/pull/0.log"
Sep 30 18:05:15 crc kubenswrapper[4778]: I0930 18:05:15.237536 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb_87932d5f-678b-4115-8fa4-37a1fa008062/util/0.log"
Sep 30 18:05:15 crc kubenswrapper[4778]: I0930 18:05:15.270198 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb_87932d5f-678b-4115-8fa4-37a1fa008062/pull/0.log"
Sep 30 18:05:15 crc kubenswrapper[4778]: I0930 18:05:15.270993 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_53b081c75a08da3b0c48a12efaceb245f87acd066a088ac5229b65eb0en95hb_87932d5f-678b-4115-8fa4-37a1fa008062/extract/0.log"
Sep 30 18:05:15 crc kubenswrapper[4778]: I0930 18:05:15.459170 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-8w47p_d0633e5b-e292-47b8-81e6-b752204748a9/kube-rbac-proxy/0.log"
Sep 30 18:05:15 crc kubenswrapper[4778]: I0930 18:05:15.478322 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-8w47p_d0633e5b-e292-47b8-81e6-b752204748a9/manager/0.log"
Sep 30 18:05:15 crc kubenswrapper[4778]: I0930 18:05:15.541429 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-49dlx_08b975a6-8387-4c2b-ab76-c34f30ac2f02/kube-rbac-proxy/0.log"
Sep 30 18:05:15 crc kubenswrapper[4778]: I0930 18:05:15.681340 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-49dlx_08b975a6-8387-4c2b-ab76-c34f30ac2f02/manager/0.log"
Sep 30 18:05:15 crc kubenswrapper[4778]: I0930 18:05:15.726128 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6567ba32-7599-4aa9-a6a8-350c3fa6df3f" path="/var/lib/kubelet/pods/6567ba32-7599-4aa9-a6a8-350c3fa6df3f/volumes"
Sep 30 18:05:15 crc kubenswrapper[4778]: I0930 18:05:15.757742 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-28qfs_9235e0ee-34ca-4be2-af88-ef695afe5224/kube-rbac-proxy/0.log"
Sep 30 18:05:15 crc kubenswrapper[4778]: I0930 18:05:15.778050 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-28qfs_9235e0ee-34ca-4be2-af88-ef695afe5224/manager/0.log"
Sep 30 18:05:15 crc kubenswrapper[4778]: I0930 18:05:15.895909 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-84vkl_e84606fe-316d-493e-8159-d9707c2f2a47/kube-rbac-proxy/0.log"
Sep 30 18:05:15 crc kubenswrapper[4778]: I0930 18:05:15.942552 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49sq7" event={"ID":"9a105625-764d-42e3-b78e-af4f717e4d63","Type":"ContainerStarted","Data":"ef2914e7f95359f16fc36720dd2eba89d3d35846290df1e33be666641cd62dfc"}
Sep 30 18:05:15 crc kubenswrapper[4778]: I0930 18:05:15.962701 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-49sq7" podStartSLOduration=2.526666727 podStartE2EDuration="4.962685631s" podCreationTimestamp="2025-09-30 18:05:11 +0000 UTC" firstStartedPulling="2025-09-30 18:05:12.901747124 +0000 UTC m=+2851.891644967" lastFinishedPulling="2025-09-30 18:05:15.337766048 +0000 UTC m=+2854.327663871" observedRunningTime="2025-09-30 18:05:15.95946892 +0000 UTC m=+2854.949366713" watchObservedRunningTime="2025-09-30 18:05:15.962685631 +0000 UTC m=+2854.952583434"
Sep 30 18:05:16 crc kubenswrapper[4778]: I0930 18:05:16.079060 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-84vkl_e84606fe-316d-493e-8159-d9707c2f2a47/manager/0.log"
Sep 30 18:05:16 crc kubenswrapper[4778]: I0930 18:05:16.100807 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-rhgbx_62a10d57-e47f-45de-9588-a0abb103b727/manager/0.log"
Sep 30 18:05:16 crc kubenswrapper[4778]: I0930 18:05:16.168329 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-rhgbx_62a10d57-e47f-45de-9588-a0abb103b727/kube-rbac-proxy/0.log"
Sep 30 18:05:16 crc kubenswrapper[4778]: I0930 18:05:16.276795 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-jhqnp_66ff4558-e436-4870-a8b0-61124b0322f7/kube-rbac-proxy/0.log"
Sep 30 18:05:16 crc kubenswrapper[4778]: I0930 18:05:16.382545 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-jhqnp_66ff4558-e436-4870-a8b0-61124b0322f7/manager/0.log"
Sep 30 18:05:16 crc kubenswrapper[4778]: I0930 18:05:16.508149 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-kx62l_aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5/kube-rbac-proxy/0.log"
Sep 30 18:05:16 crc kubenswrapper[4778]: I0930 18:05:16.586895 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-kx62l_aaf4666e-b07a-4195-ab1d-9fb5e0d00fb5/manager/0.log"
Sep 30 18:05:16 crc kubenswrapper[4778]: I0930 18:05:16.597235 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-w69kc_0bbfd0e9-af02-499b-98eb-c27f5eaed971/kube-rbac-proxy/0.log"
Sep 30 18:05:16 crc kubenswrapper[4778]: I0930 18:05:16.728686 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-w69kc_0bbfd0e9-af02-499b-98eb-c27f5eaed971/manager/0.log"
path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-jn9dq_6ecf488a-bace-4ef9-bec7-aa29f7dd85e8/kube-rbac-proxy/0.log" Sep 30 18:05:16 crc kubenswrapper[4778]: I0930 18:05:16.890380 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-jn9dq_6ecf488a-bace-4ef9-bec7-aa29f7dd85e8/manager/0.log" Sep 30 18:05:16 crc kubenswrapper[4778]: I0930 18:05:16.992509 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-ldb8b_849e7052-8c4b-469c-8dbe-3d6d3099ed7d/manager/0.log" Sep 30 18:05:17 crc kubenswrapper[4778]: I0930 18:05:17.013105 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-ldb8b_849e7052-8c4b-469c-8dbe-3d6d3099ed7d/kube-rbac-proxy/0.log" Sep 30 18:05:17 crc kubenswrapper[4778]: I0930 18:05:17.174954 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-x9xzq_7ae19188-a950-4136-b02f-264588920c60/kube-rbac-proxy/0.log" Sep 30 18:05:17 crc kubenswrapper[4778]: I0930 18:05:17.201888 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-x9xzq_7ae19188-a950-4136-b02f-264588920c60/manager/0.log" Sep 30 18:05:17 crc kubenswrapper[4778]: I0930 18:05:17.304208 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-n7ssv_3fbf26ad-2f08-4506-8d77-c0162b8792f5/kube-rbac-proxy/0.log" Sep 30 18:05:17 crc kubenswrapper[4778]: I0930 18:05:17.411428 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-n7ssv_3fbf26ad-2f08-4506-8d77-c0162b8792f5/manager/0.log" Sep 30 18:05:17 crc kubenswrapper[4778]: I0930 18:05:17.659555 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-9s9jn_4cdf1c9f-3579-41f3-9a10-05c1c8a5f241/kube-rbac-proxy/0.log" Sep 30 18:05:17 crc kubenswrapper[4778]: I0930 18:05:17.764864 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-9s9jn_4cdf1c9f-3579-41f3-9a10-05c1c8a5f241/manager/0.log" Sep 30 18:05:17 crc kubenswrapper[4778]: I0930 18:05:17.861953 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-ztxmv_89a96b51-b353-47e4-8242-9de93451c210/manager/0.log" Sep 30 18:05:17 crc kubenswrapper[4778]: I0930 18:05:17.868112 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-ztxmv_89a96b51-b353-47e4-8242-9de93451c210/kube-rbac-proxy/0.log" Sep 30 18:05:17 crc kubenswrapper[4778]: I0930 18:05:17.963054 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx_77616b28-3707-4b13-a2ca-efc265a63676/kube-rbac-proxy/0.log" Sep 30 18:05:18 crc kubenswrapper[4778]: I0930 18:05:18.002955 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-55f4778d9fnwfnx_77616b28-3707-4b13-a2ca-efc265a63676/manager/0.log" Sep 30 18:05:18 crc kubenswrapper[4778]: I0930 18:05:18.073125 4778 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5468b64689-5nf52_bcfb480c-3a1f-4a9d-83a0-183c46be742a/kube-rbac-proxy/0.log" Sep 30 18:05:18 crc kubenswrapper[4778]: I0930 18:05:18.199595 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-d8fdfd448-zzqnn_ded817e4-c278-4c72-8a31-2826f9a59292/kube-rbac-proxy/0.log" Sep 30 18:05:18 crc kubenswrapper[4778]: I0930 18:05:18.391908 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-d8fdfd448-zzqnn_ded817e4-c278-4c72-8a31-2826f9a59292/operator/0.log" Sep 30 18:05:18 crc kubenswrapper[4778]: I0930 18:05:18.418926 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-d4xhf_bcfc85f5-7dbe-4093-8177-a1413ebfbaca/registry-server/0.log" Sep 30 18:05:18 crc kubenswrapper[4778]: I0930 18:05:18.508870 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-m4rzv_3fd2b550-65c1-4163-b735-2e517b37c34c/kube-rbac-proxy/0.log" Sep 30 18:05:18 crc kubenswrapper[4778]: I0930 18:05:18.637596 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-m4rzv_3fd2b550-65c1-4163-b735-2e517b37c34c/manager/0.log" Sep 30 18:05:18 crc kubenswrapper[4778]: I0930 18:05:18.708465 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-vfvgx_60314fa2-575c-42dc-beb7-d37d3ba69cac/kube-rbac-proxy/0.log" Sep 30 18:05:18 crc kubenswrapper[4778]: I0930 18:05:18.713955 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-vfvgx_60314fa2-575c-42dc-beb7-d37d3ba69cac/manager/0.log" Sep 30 18:05:18 crc kubenswrapper[4778]: I0930 18:05:18.839698 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5468b64689-5nf52_bcfb480c-3a1f-4a9d-83a0-183c46be742a/manager/0.log" Sep 30 18:05:18 crc kubenswrapper[4778]: I0930 18:05:18.854118 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-hrfgk_dd54c570-121e-4daa-b264-fbb40a606478/operator/0.log" Sep 30 18:05:18 crc kubenswrapper[4778]: I0930 18:05:18.978988 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-vtw6g_b159e466-544e-47b1-9617-d1ffcec28b1c/kube-rbac-proxy/0.log" Sep 30 18:05:19 crc kubenswrapper[4778]: I0930 18:05:19.039675 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-vtw6g_b159e466-544e-47b1-9617-d1ffcec28b1c/manager/0.log" Sep 30 18:05:19 crc kubenswrapper[4778]: I0930 18:05:19.046000 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7bdb6cfb74-8hfcx_9f70afd9-8a72-4776-8586-6afee1834e3f/kube-rbac-proxy/0.log" Sep 30 18:05:19 crc kubenswrapper[4778]: I0930 18:05:19.111158 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7bdb6cfb74-8hfcx_9f70afd9-8a72-4776-8586-6afee1834e3f/manager/0.log" Sep 30 18:05:19 crc kubenswrapper[4778]: I0930 18:05:19.217078 
4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-j4dhd_c32b9788-c6d5-43ae-a777-ce4aeb3cabf0/kube-rbac-proxy/0.log" Sep 30 18:05:19 crc kubenswrapper[4778]: I0930 18:05:19.231062 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-j4dhd_c32b9788-c6d5-43ae-a777-ce4aeb3cabf0/manager/0.log" Sep 30 18:05:19 crc kubenswrapper[4778]: I0930 18:05:19.299183 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-z6mzw_84123306-eb7b-45d1-b542-f1f831949fb4/kube-rbac-proxy/0.log" Sep 30 18:05:19 crc kubenswrapper[4778]: I0930 18:05:19.382757 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-76669f99c-z6mzw_84123306-eb7b-45d1-b542-f1f831949fb4/manager/0.log" Sep 30 18:05:21 crc kubenswrapper[4778]: I0930 18:05:21.932137 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-49sq7" Sep 30 18:05:21 crc kubenswrapper[4778]: I0930 18:05:21.932486 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-49sq7" Sep 30 18:05:21 crc kubenswrapper[4778]: I0930 18:05:21.983254 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-49sq7" Sep 30 18:05:22 crc kubenswrapper[4778]: I0930 18:05:22.055991 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-49sq7" Sep 30 18:05:22 crc kubenswrapper[4778]: I0930 18:05:22.222667 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-49sq7"] Sep 30 18:05:24 crc kubenswrapper[4778]: I0930 18:05:24.012904 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-49sq7" podUID="9a105625-764d-42e3-b78e-af4f717e4d63" containerName="registry-server" containerID="cri-o://ef2914e7f95359f16fc36720dd2eba89d3d35846290df1e33be666641cd62dfc" gracePeriod=2 Sep 30 18:05:24 crc kubenswrapper[4778]: I0930 18:05:24.462949 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-49sq7" Sep 30 18:05:24 crc kubenswrapper[4778]: I0930 18:05:24.562989 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxttx\" (UniqueName: \"kubernetes.io/projected/9a105625-764d-42e3-b78e-af4f717e4d63-kube-api-access-fxttx\") pod \"9a105625-764d-42e3-b78e-af4f717e4d63\" (UID: \"9a105625-764d-42e3-b78e-af4f717e4d63\") " Sep 30 18:05:24 crc kubenswrapper[4778]: I0930 18:05:24.563056 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a105625-764d-42e3-b78e-af4f717e4d63-catalog-content\") pod \"9a105625-764d-42e3-b78e-af4f717e4d63\" (UID: \"9a105625-764d-42e3-b78e-af4f717e4d63\") " Sep 30 18:05:24 crc kubenswrapper[4778]: I0930 18:05:24.563220 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a105625-764d-42e3-b78e-af4f717e4d63-utilities\") pod \"9a105625-764d-42e3-b78e-af4f717e4d63\" (UID: \"9a105625-764d-42e3-b78e-af4f717e4d63\") " Sep 30 18:05:24 crc kubenswrapper[4778]: I0930 18:05:24.564241 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a105625-764d-42e3-b78e-af4f717e4d63-utilities" (OuterVolumeSpecName: "utilities") pod "9a105625-764d-42e3-b78e-af4f717e4d63" (UID: "9a105625-764d-42e3-b78e-af4f717e4d63"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:05:24 crc kubenswrapper[4778]: I0930 18:05:24.570939 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a105625-764d-42e3-b78e-af4f717e4d63-kube-api-access-fxttx" (OuterVolumeSpecName: "kube-api-access-fxttx") pod "9a105625-764d-42e3-b78e-af4f717e4d63" (UID: "9a105625-764d-42e3-b78e-af4f717e4d63"). InnerVolumeSpecName "kube-api-access-fxttx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:05:24 crc kubenswrapper[4778]: I0930 18:05:24.665454 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxttx\" (UniqueName: \"kubernetes.io/projected/9a105625-764d-42e3-b78e-af4f717e4d63-kube-api-access-fxttx\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:24 crc kubenswrapper[4778]: I0930 18:05:24.665491 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a105625-764d-42e3-b78e-af4f717e4d63-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:24 crc kubenswrapper[4778]: I0930 18:05:24.703734 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a105625-764d-42e3-b78e-af4f717e4d63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9a105625-764d-42e3-b78e-af4f717e4d63" (UID: "9a105625-764d-42e3-b78e-af4f717e4d63"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:05:24 crc kubenswrapper[4778]: I0930 18:05:24.766601 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a105625-764d-42e3-b78e-af4f717e4d63-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:05:25 crc kubenswrapper[4778]: I0930 18:05:25.022111 4778 generic.go:334] "Generic (PLEG): container finished" podID="9a105625-764d-42e3-b78e-af4f717e4d63" containerID="ef2914e7f95359f16fc36720dd2eba89d3d35846290df1e33be666641cd62dfc" exitCode=0 Sep 30 18:05:25 crc kubenswrapper[4778]: I0930 18:05:25.022158 4778 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-49sq7" Sep 30 18:05:25 crc kubenswrapper[4778]: I0930 18:05:25.022174 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49sq7" event={"ID":"9a105625-764d-42e3-b78e-af4f717e4d63","Type":"ContainerDied","Data":"ef2914e7f95359f16fc36720dd2eba89d3d35846290df1e33be666641cd62dfc"} Sep 30 18:05:25 crc kubenswrapper[4778]: I0930 18:05:25.023457 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-49sq7" event={"ID":"9a105625-764d-42e3-b78e-af4f717e4d63","Type":"ContainerDied","Data":"f5da521ee9eda19295777e1e4bf87fa031ec0f2861a4c0d65ac400ffd73d9241"} Sep 30 18:05:25 crc kubenswrapper[4778]: I0930 18:05:25.023485 4778 scope.go:117] "RemoveContainer" containerID="ef2914e7f95359f16fc36720dd2eba89d3d35846290df1e33be666641cd62dfc" Sep 30 18:05:25 crc kubenswrapper[4778]: I0930 18:05:25.039671 4778 scope.go:117] "RemoveContainer" containerID="af48b288899d1803d09d91c642ba1bb6298062f65240920074a0b4c9baa4119a" Sep 30 18:05:25 crc kubenswrapper[4778]: I0930 18:05:25.067238 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-49sq7"] Sep 30 18:05:25 crc kubenswrapper[4778]: I0930 18:05:25.077234 4778 scope.go:117] "RemoveContainer" containerID="4636ea4cf846f010e1af8cd879bc5863ddff7cbaa116249c2cbbd148ca70f236" Sep 30 18:05:25 crc kubenswrapper[4778]: I0930 18:05:25.077489 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-49sq7"] Sep 30 18:05:25 crc kubenswrapper[4778]: I0930 18:05:25.105687 4778 scope.go:117] "RemoveContainer" containerID="ef2914e7f95359f16fc36720dd2eba89d3d35846290df1e33be666641cd62dfc" Sep 30 18:05:25 crc kubenswrapper[4778]: E0930 18:05:25.106116 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef2914e7f95359f16fc36720dd2eba89d3d35846290df1e33be666641cd62dfc\": container with ID starting with ef2914e7f95359f16fc36720dd2eba89d3d35846290df1e33be666641cd62dfc not found: ID does not exist" containerID="ef2914e7f95359f16fc36720dd2eba89d3d35846290df1e33be666641cd62dfc" Sep 30 18:05:25 crc kubenswrapper[4778]: I0930 18:05:25.106159 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef2914e7f95359f16fc36720dd2eba89d3d35846290df1e33be666641cd62dfc"} err="failed to get container status \"ef2914e7f95359f16fc36720dd2eba89d3d35846290df1e33be666641cd62dfc\": rpc error: code = NotFound desc = could not find container \"ef2914e7f95359f16fc36720dd2eba89d3d35846290df1e33be666641cd62dfc\": container with ID starting with ef2914e7f95359f16fc36720dd2eba89d3d35846290df1e33be666641cd62dfc not found: ID does not exist" Sep 30 
18:05:25 crc kubenswrapper[4778]: I0930 18:05:25.106187 4778 scope.go:117] "RemoveContainer" containerID="af48b288899d1803d09d91c642ba1bb6298062f65240920074a0b4c9baa4119a" Sep 30 18:05:25 crc kubenswrapper[4778]: E0930 18:05:25.106611 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af48b288899d1803d09d91c642ba1bb6298062f65240920074a0b4c9baa4119a\": container with ID starting with af48b288899d1803d09d91c642ba1bb6298062f65240920074a0b4c9baa4119a not found: ID does not exist" containerID="af48b288899d1803d09d91c642ba1bb6298062f65240920074a0b4c9baa4119a" Sep 30 18:05:25 crc kubenswrapper[4778]: I0930 18:05:25.106646 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af48b288899d1803d09d91c642ba1bb6298062f65240920074a0b4c9baa4119a"} err="failed to get container status \"af48b288899d1803d09d91c642ba1bb6298062f65240920074a0b4c9baa4119a\": rpc error: code = NotFound desc = could not find container \"af48b288899d1803d09d91c642ba1bb6298062f65240920074a0b4c9baa4119a\": container with ID starting with af48b288899d1803d09d91c642ba1bb6298062f65240920074a0b4c9baa4119a not found: ID does not exist" Sep 30 18:05:25 crc kubenswrapper[4778]: I0930 18:05:25.106659 4778 scope.go:117] "RemoveContainer" containerID="4636ea4cf846f010e1af8cd879bc5863ddff7cbaa116249c2cbbd148ca70f236" Sep 30 18:05:25 crc kubenswrapper[4778]: E0930 18:05:25.108959 4778 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4636ea4cf846f010e1af8cd879bc5863ddff7cbaa116249c2cbbd148ca70f236\": container with ID starting with 4636ea4cf846f010e1af8cd879bc5863ddff7cbaa116249c2cbbd148ca70f236 not found: ID does not exist" containerID="4636ea4cf846f010e1af8cd879bc5863ddff7cbaa116249c2cbbd148ca70f236" Sep 30 18:05:25 crc kubenswrapper[4778]: I0930 18:05:25.108981 4778 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4636ea4cf846f010e1af8cd879bc5863ddff7cbaa116249c2cbbd148ca70f236"} err="failed to get container status \"4636ea4cf846f010e1af8cd879bc5863ddff7cbaa116249c2cbbd148ca70f236\": rpc error: code = NotFound desc = could not find container \"4636ea4cf846f010e1af8cd879bc5863ddff7cbaa116249c2cbbd148ca70f236\": container with ID starting with 4636ea4cf846f010e1af8cd879bc5863ddff7cbaa116249c2cbbd148ca70f236 not found: ID does not exist" Sep 30 18:05:25 crc kubenswrapper[4778]: I0930 18:05:25.726557 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a105625-764d-42e3-b78e-af4f717e4d63" path="/var/lib/kubelet/pods/9a105625-764d-42e3-b78e-af4f717e4d63/volumes" Sep 30 18:05:34 crc kubenswrapper[4778]: I0930 18:05:34.759390 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-b8229_ce48deed-422c-411b-946d-30a87d293815/control-plane-machine-set-operator/0.log" Sep 30 18:05:34 crc kubenswrapper[4778]: I0930 18:05:34.926224 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-whtc5_0de525d2-5fd2-4fd3-9524-3a5505955417/kube-rbac-proxy/0.log" Sep 30 18:05:34 crc kubenswrapper[4778]: I0930 18:05:34.931312 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-whtc5_0de525d2-5fd2-4fd3-9524-3a5505955417/machine-api-operator/0.log" Sep 30 18:05:44 crc kubenswrapper[4778]: I0930 
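The RemoveContainer / NotFound pairs above are retried cleanup hitting containers that are already gone: CRI-O answers with gRPC NotFound, and the kubelet logs the error and moves on, so deletion stays idempotent. A sketch of that idiom (not kubelet's code; errNotFound stands in for the gRPC NotFound status):

package main

import (
	"errors"
	"fmt"
)

var errNotFound = errors.New("NotFound")

// runtimeRemove simulates a CRI delete for a container already removed
// together with its pod sandbox.
func runtimeRemove(id string) error {
	return fmt.Errorf("could not find container %q: %w", id, errNotFound)
}

func removeContainer(id string) error {
	err := runtimeRemove(id)
	if errors.Is(err, errNotFound) {
		return nil // already gone: treat NotFound as success
	}
	return err
}

func main() {
	fmt.Println(removeContainer("ef2914e7f95359f16fc36720dd2eba89d3d35846290df1e33be666641cd62dfc"))
}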
Sep 30 18:05:44 crc kubenswrapper[4778]: I0930 18:05:44.812087 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 18:05:44 crc kubenswrapper[4778]: I0930 18:05:44.812547 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 18:05:46 crc kubenswrapper[4778]: I0930 18:05:46.437925 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-bkzbw_a646c0c3-0f15-434d-a414-f1523b29aba5/cert-manager-controller/0.log"
Sep 30 18:05:46 crc kubenswrapper[4778]: I0930 18:05:46.636328 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-nslws_e372bf84-9d99-45cc-9225-7ad37b0c60b8/cert-manager-cainjector/0.log"
Sep 30 18:05:46 crc kubenswrapper[4778]: I0930 18:05:46.687457 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-vzjnq_0803c4d3-0db6-48be-bc51-a6f24b97ed36/cert-manager-webhook/0.log"
Sep 30 18:05:58 crc kubenswrapper[4778]: I0930 18:05:58.023200 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-hvhll_ac265e30-ad2e-426f-a657-35ca285dc557/nmstate-console-plugin/0.log"
Sep 30 18:05:58 crc kubenswrapper[4778]: I0930 18:05:58.199298 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-tlb8j_87413a2b-414e-464c-9806-ea029bd57019/kube-rbac-proxy/0.log"
Sep 30 18:05:58 crc kubenswrapper[4778]: I0930 18:05:58.200808 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-hkkh6_5afb8964-3c8d-4da3-a57e-e8db0aae4b6d/nmstate-handler/0.log"
Sep 30 18:05:58 crc kubenswrapper[4778]: I0930 18:05:58.249813 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-tlb8j_87413a2b-414e-464c-9806-ea029bd57019/nmstate-metrics/0.log"
Sep 30 18:05:58 crc kubenswrapper[4778]: I0930 18:05:58.405298 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-ktwgv_d0be6779-453e-42b4-b20c-85d16706300f/nmstate-operator/0.log"
Sep 30 18:05:58 crc kubenswrapper[4778]: I0930 18:05:58.411877 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-zzvns_69e5d000-ba92-4853-9c50-db1681a3f87a/nmstate-webhook/0.log"
Sep 30 18:06:11 crc kubenswrapper[4778]: I0930 18:06:11.266195 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-k845h_6ac78c57-42f5-4d55-98a0-74b1c3b4beb3/kube-rbac-proxy/0.log"
Sep 30 18:06:11 crc kubenswrapper[4778]: I0930 18:06:11.345583 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-k845h_6ac78c57-42f5-4d55-98a0-74b1c3b4beb3/controller/0.log"
Sep 30 18:06:11 crc kubenswrapper[4778]: I0930 18:06:11.464361 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/cp-frr-files/0.log"
Sep 30 18:06:11 crc kubenswrapper[4778]: I0930 18:06:11.585482 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/cp-frr-files/0.log"
Sep 30 18:06:11 crc kubenswrapper[4778]: I0930 18:06:11.619092 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/cp-reloader/0.log"
Sep 30 18:06:11 crc kubenswrapper[4778]: I0930 18:06:11.644338 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/cp-metrics/0.log"
Sep 30 18:06:11 crc kubenswrapper[4778]: I0930 18:06:11.695044 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/cp-reloader/0.log"
Sep 30 18:06:11 crc kubenswrapper[4778]: I0930 18:06:11.879700 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/cp-reloader/0.log"
Sep 30 18:06:11 crc kubenswrapper[4778]: I0930 18:06:11.889689 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/cp-frr-files/0.log"
Sep 30 18:06:11 crc kubenswrapper[4778]: I0930 18:06:11.909990 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/cp-metrics/0.log"
Sep 30 18:06:11 crc kubenswrapper[4778]: I0930 18:06:11.936670 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/cp-metrics/0.log"
Sep 30 18:06:12 crc kubenswrapper[4778]: I0930 18:06:12.099202 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/cp-frr-files/0.log"
Sep 30 18:06:12 crc kubenswrapper[4778]: I0930 18:06:12.104135 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/cp-reloader/0.log"
Sep 30 18:06:12 crc kubenswrapper[4778]: I0930 18:06:12.109462 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/cp-metrics/0.log"
Sep 30 18:06:12 crc kubenswrapper[4778]: I0930 18:06:12.115875 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/controller/0.log"
Sep 30 18:06:12 crc kubenswrapper[4778]: I0930 18:06:12.286689 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/kube-rbac-proxy-frr/0.log"
Sep 30 18:06:12 crc kubenswrapper[4778]: I0930 18:06:12.304780 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/kube-rbac-proxy/0.log"
Sep 30 18:06:12 crc kubenswrapper[4778]: I0930 18:06:12.324591 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/frr-metrics/0.log"
Sep 30 18:06:12 crc kubenswrapper[4778]: I0930 18:06:12.480052 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/reloader/0.log"
Sep 30 18:06:12 crc kubenswrapper[4778]: I0930 18:06:12.520445 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-p5b64_c2bdec16-ac5d-4456-9862-220cd1ee1d40/frr-k8s-webhook-server/0.log"
Sep 30 18:06:12 crc kubenswrapper[4778]: I0930 18:06:12.702267 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5759cd8585-r49vw_e6d40185-14db-4566-b50d-1c05eef2b841/manager/0.log"
Sep 30 18:06:12 crc kubenswrapper[4778]: I0930 18:06:12.867047 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-65cdcfb5d5-65ppt_560fc21e-88c9-48b3-8077-5ee5d1056690/webhook-server/0.log"
Sep 30 18:06:12 crc kubenswrapper[4778]: I0930 18:06:12.969936 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8h2sm_c9e84483-49a9-42d3-b721-929a58feda93/kube-rbac-proxy/0.log"
Sep 30 18:06:13 crc kubenswrapper[4778]: I0930 18:06:13.153039 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-26pmr_841759dc-7794-4849-8d7a-a16b1674d011/frr/0.log"
Sep 30 18:06:13 crc kubenswrapper[4778]: I0930 18:06:13.386702 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-8h2sm_c9e84483-49a9-42d3-b721-929a58feda93/speaker/0.log"
Sep 30 18:06:14 crc kubenswrapper[4778]: I0930 18:06:14.812148 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 18:06:14 crc kubenswrapper[4778]: I0930 18:06:14.812503 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 18:06:14 crc kubenswrapper[4778]: I0930 18:06:14.812547 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb"
Sep 30 18:06:14 crc kubenswrapper[4778]: I0930 18:06:14.813205 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2a5286e9db2f7bc22ae7cbc733132d15348c9a7d4ac339ce379b1f1d56fb4677"} pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 18:06:14 crc kubenswrapper[4778]: I0930 18:06:14.813256 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" containerID="cri-o://2a5286e9db2f7bc22ae7cbc733132d15348c9a7d4ac339ce379b1f1d56fb4677" gracePeriod=600
Sep 30 18:06:15 crc kubenswrapper[4778]: I0930 18:06:15.441476 4778 generic.go:334] "Generic (PLEG): container finished" podID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerID="2a5286e9db2f7bc22ae7cbc733132d15348c9a7d4ac339ce379b1f1d56fb4677" exitCode=0
event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerDied","Data":"2a5286e9db2f7bc22ae7cbc733132d15348c9a7d4ac339ce379b1f1d56fb4677"} Sep 30 18:06:15 crc kubenswrapper[4778]: I0930 18:06:15.442041 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerStarted","Data":"f835b78108e6c88fc507ec89d37fa501fbc67c0d1f1a43fb81eae881220b484a"} Sep 30 18:06:15 crc kubenswrapper[4778]: I0930 18:06:15.442072 4778 scope.go:117] "RemoveContainer" containerID="5511073a0e6b477b35f1c85466bf5a6e5af0ed9a1319e55ba7104ea6632137f2" Sep 30 18:06:25 crc kubenswrapper[4778]: I0930 18:06:25.146383 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7_f131134d-496f-4a7b-849b-89d1c3100208/util/0.log" Sep 30 18:06:25 crc kubenswrapper[4778]: I0930 18:06:25.287433 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7_f131134d-496f-4a7b-849b-89d1c3100208/pull/0.log" Sep 30 18:06:25 crc kubenswrapper[4778]: I0930 18:06:25.290414 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7_f131134d-496f-4a7b-849b-89d1c3100208/util/0.log" Sep 30 18:06:25 crc kubenswrapper[4778]: I0930 18:06:25.331828 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7_f131134d-496f-4a7b-849b-89d1c3100208/pull/0.log" Sep 30 18:06:25 crc kubenswrapper[4778]: I0930 18:06:25.476994 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7_f131134d-496f-4a7b-849b-89d1c3100208/util/0.log" Sep 30 18:06:25 crc kubenswrapper[4778]: I0930 18:06:25.517761 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7_f131134d-496f-4a7b-849b-89d1c3100208/extract/0.log" Sep 30 18:06:25 crc kubenswrapper[4778]: I0930 18:06:25.526744 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcdwnk7_f131134d-496f-4a7b-849b-89d1c3100208/pull/0.log" Sep 30 18:06:25 crc kubenswrapper[4778]: I0930 18:06:25.653231 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lx7lx_2f2dbd49-d11a-4af8-8241-89981ad46467/extract-utilities/0.log" Sep 30 18:06:25 crc kubenswrapper[4778]: I0930 18:06:25.829810 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lx7lx_2f2dbd49-d11a-4af8-8241-89981ad46467/extract-utilities/0.log" Sep 30 18:06:25 crc kubenswrapper[4778]: I0930 18:06:25.833592 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lx7lx_2f2dbd49-d11a-4af8-8241-89981ad46467/extract-content/0.log" Sep 30 18:06:25 crc kubenswrapper[4778]: I0930 18:06:25.835212 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lx7lx_2f2dbd49-d11a-4af8-8241-89981ad46467/extract-content/0.log" Sep 30 18:06:25 crc kubenswrapper[4778]: I0930 18:06:25.986186 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-lx7lx_2f2dbd49-d11a-4af8-8241-89981ad46467/extract-utilities/0.log" Sep 30 18:06:26 crc kubenswrapper[4778]: I0930 18:06:26.008780 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lx7lx_2f2dbd49-d11a-4af8-8241-89981ad46467/extract-content/0.log" Sep 30 18:06:26 crc kubenswrapper[4778]: I0930 18:06:26.277290 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6ls2z_b10c0547-5d08-46e5-bea1-53e2129b7f3a/extract-utilities/0.log" Sep 30 18:06:26 crc kubenswrapper[4778]: I0930 18:06:26.383859 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lx7lx_2f2dbd49-d11a-4af8-8241-89981ad46467/registry-server/0.log" Sep 30 18:06:26 crc kubenswrapper[4778]: I0930 18:06:26.445356 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6ls2z_b10c0547-5d08-46e5-bea1-53e2129b7f3a/extract-utilities/0.log" Sep 30 18:06:26 crc kubenswrapper[4778]: I0930 18:06:26.484821 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6ls2z_b10c0547-5d08-46e5-bea1-53e2129b7f3a/extract-content/0.log" Sep 30 18:06:26 crc kubenswrapper[4778]: I0930 18:06:26.529810 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6ls2z_b10c0547-5d08-46e5-bea1-53e2129b7f3a/extract-content/0.log" Sep 30 18:06:26 crc kubenswrapper[4778]: I0930 18:06:26.648297 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6ls2z_b10c0547-5d08-46e5-bea1-53e2129b7f3a/extract-utilities/0.log" Sep 30 18:06:26 crc kubenswrapper[4778]: I0930 18:06:26.650269 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6ls2z_b10c0547-5d08-46e5-bea1-53e2129b7f3a/extract-content/0.log" Sep 30 18:06:26 crc kubenswrapper[4778]: I0930 18:06:26.841304 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr_0dfd6413-a4b2-4598-bd14-4e9b332e8221/util/0.log" Sep 30 18:06:27 crc kubenswrapper[4778]: I0930 18:06:27.061994 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr_0dfd6413-a4b2-4598-bd14-4e9b332e8221/pull/0.log" Sep 30 18:06:27 crc kubenswrapper[4778]: I0930 18:06:27.063207 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr_0dfd6413-a4b2-4598-bd14-4e9b332e8221/pull/0.log" Sep 30 18:06:27 crc kubenswrapper[4778]: I0930 18:06:27.100009 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr_0dfd6413-a4b2-4598-bd14-4e9b332e8221/util/0.log" Sep 30 18:06:27 crc kubenswrapper[4778]: I0930 18:06:27.211508 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6ls2z_b10c0547-5d08-46e5-bea1-53e2129b7f3a/registry-server/0.log" Sep 30 18:06:27 crc kubenswrapper[4778]: I0930 18:06:27.314652 4778 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr_0dfd6413-a4b2-4598-bd14-4e9b332e8221/util/0.log" Sep 30 18:06:27 crc kubenswrapper[4778]: I0930 18:06:27.318429 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr_0dfd6413-a4b2-4598-bd14-4e9b332e8221/extract/0.log" Sep 30 18:06:27 crc kubenswrapper[4778]: I0930 18:06:27.332158 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d965t2vr_0dfd6413-a4b2-4598-bd14-4e9b332e8221/pull/0.log" Sep 30 18:06:27 crc kubenswrapper[4778]: I0930 18:06:27.537115 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-9nzgv_d46eace9-9047-4db1-acb9-1a588ab49434/marketplace-operator/0.log" Sep 30 18:06:27 crc kubenswrapper[4778]: I0930 18:06:27.573909 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9gtp4_23443d75-478c-4f20-b3fd-3eceb05a37d9/extract-utilities/0.log" Sep 30 18:06:27 crc kubenswrapper[4778]: I0930 18:06:27.686818 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9gtp4_23443d75-478c-4f20-b3fd-3eceb05a37d9/extract-content/0.log" Sep 30 18:06:27 crc kubenswrapper[4778]: I0930 18:06:27.696846 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9gtp4_23443d75-478c-4f20-b3fd-3eceb05a37d9/extract-utilities/0.log" Sep 30 18:06:27 crc kubenswrapper[4778]: I0930 18:06:27.742830 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9gtp4_23443d75-478c-4f20-b3fd-3eceb05a37d9/extract-content/0.log" Sep 30 18:06:27 crc kubenswrapper[4778]: I0930 18:06:27.860723 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9gtp4_23443d75-478c-4f20-b3fd-3eceb05a37d9/extract-utilities/0.log" Sep 30 18:06:27 crc kubenswrapper[4778]: I0930 18:06:27.902718 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9gtp4_23443d75-478c-4f20-b3fd-3eceb05a37d9/extract-content/0.log" Sep 30 18:06:27 crc kubenswrapper[4778]: I0930 18:06:27.991195 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9gtp4_23443d75-478c-4f20-b3fd-3eceb05a37d9/registry-server/0.log" Sep 30 18:06:28 crc kubenswrapper[4778]: I0930 18:06:28.059429 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xsmps_8227d8db-e2f9-44dc-a41a-efb4088be2fa/extract-utilities/0.log" Sep 30 18:06:28 crc kubenswrapper[4778]: I0930 18:06:28.212715 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xsmps_8227d8db-e2f9-44dc-a41a-efb4088be2fa/extract-content/0.log" Sep 30 18:06:28 crc kubenswrapper[4778]: I0930 18:06:28.214608 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xsmps_8227d8db-e2f9-44dc-a41a-efb4088be2fa/extract-content/0.log" Sep 30 18:06:28 crc kubenswrapper[4778]: I0930 18:06:28.219741 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xsmps_8227d8db-e2f9-44dc-a41a-efb4088be2fa/extract-utilities/0.log" Sep 30 18:06:28 crc kubenswrapper[4778]: 
I0930 18:06:28.378251 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xsmps_8227d8db-e2f9-44dc-a41a-efb4088be2fa/extract-utilities/0.log" Sep 30 18:06:28 crc kubenswrapper[4778]: I0930 18:06:28.398710 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xsmps_8227d8db-e2f9-44dc-a41a-efb4088be2fa/extract-content/0.log" Sep 30 18:06:28 crc kubenswrapper[4778]: I0930 18:06:28.704917 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xsmps_8227d8db-e2f9-44dc-a41a-efb4088be2fa/registry-server/0.log" Sep 30 18:07:15 crc kubenswrapper[4778]: I0930 18:07:15.873338 4778 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-89jlb"] Sep 30 18:07:15 crc kubenswrapper[4778]: E0930 18:07:15.874660 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a105625-764d-42e3-b78e-af4f717e4d63" containerName="extract-content" Sep 30 18:07:15 crc kubenswrapper[4778]: I0930 18:07:15.874681 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a105625-764d-42e3-b78e-af4f717e4d63" containerName="extract-content" Sep 30 18:07:15 crc kubenswrapper[4778]: E0930 18:07:15.874701 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6567ba32-7599-4aa9-a6a8-350c3fa6df3f" containerName="container-00" Sep 30 18:07:15 crc kubenswrapper[4778]: I0930 18:07:15.874713 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="6567ba32-7599-4aa9-a6a8-350c3fa6df3f" containerName="container-00" Sep 30 18:07:15 crc kubenswrapper[4778]: E0930 18:07:15.874739 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a105625-764d-42e3-b78e-af4f717e4d63" containerName="registry-server" Sep 30 18:07:15 crc kubenswrapper[4778]: I0930 18:07:15.874754 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a105625-764d-42e3-b78e-af4f717e4d63" containerName="registry-server" Sep 30 18:07:15 crc kubenswrapper[4778]: E0930 18:07:15.874778 4778 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a105625-764d-42e3-b78e-af4f717e4d63" containerName="extract-utilities" Sep 30 18:07:15 crc kubenswrapper[4778]: I0930 18:07:15.874791 4778 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a105625-764d-42e3-b78e-af4f717e4d63" containerName="extract-utilities" Sep 30 18:07:15 crc kubenswrapper[4778]: I0930 18:07:15.875043 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a105625-764d-42e3-b78e-af4f717e4d63" containerName="registry-server" Sep 30 18:07:15 crc kubenswrapper[4778]: I0930 18:07:15.875065 4778 memory_manager.go:354] "RemoveStaleState removing state" podUID="6567ba32-7599-4aa9-a6a8-350c3fa6df3f" containerName="container-00" Sep 30 18:07:15 crc kubenswrapper[4778]: I0930 18:07:15.876839 4778 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-89jlb" Sep 30 18:07:15 crc kubenswrapper[4778]: I0930 18:07:15.891919 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-89jlb"] Sep 30 18:07:16 crc kubenswrapper[4778]: I0930 18:07:16.011189 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea95c66-6f88-4fa4-970a-3f43e6dcec16-catalog-content\") pod \"redhat-operators-89jlb\" (UID: \"cea95c66-6f88-4fa4-970a-3f43e6dcec16\") " pod="openshift-marketplace/redhat-operators-89jlb" Sep 30 18:07:16 crc kubenswrapper[4778]: I0930 18:07:16.011362 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea95c66-6f88-4fa4-970a-3f43e6dcec16-utilities\") pod \"redhat-operators-89jlb\" (UID: \"cea95c66-6f88-4fa4-970a-3f43e6dcec16\") " pod="openshift-marketplace/redhat-operators-89jlb" Sep 30 18:07:16 crc kubenswrapper[4778]: I0930 18:07:16.011413 4778 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79fwz\" (UniqueName: \"kubernetes.io/projected/cea95c66-6f88-4fa4-970a-3f43e6dcec16-kube-api-access-79fwz\") pod \"redhat-operators-89jlb\" (UID: \"cea95c66-6f88-4fa4-970a-3f43e6dcec16\") " pod="openshift-marketplace/redhat-operators-89jlb" Sep 30 18:07:16 crc kubenswrapper[4778]: I0930 18:07:16.113395 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea95c66-6f88-4fa4-970a-3f43e6dcec16-utilities\") pod \"redhat-operators-89jlb\" (UID: \"cea95c66-6f88-4fa4-970a-3f43e6dcec16\") " pod="openshift-marketplace/redhat-operators-89jlb" Sep 30 18:07:16 crc kubenswrapper[4778]: I0930 18:07:16.113457 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79fwz\" (UniqueName: \"kubernetes.io/projected/cea95c66-6f88-4fa4-970a-3f43e6dcec16-kube-api-access-79fwz\") pod \"redhat-operators-89jlb\" (UID: \"cea95c66-6f88-4fa4-970a-3f43e6dcec16\") " pod="openshift-marketplace/redhat-operators-89jlb" Sep 30 18:07:16 crc kubenswrapper[4778]: I0930 18:07:16.113540 4778 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea95c66-6f88-4fa4-970a-3f43e6dcec16-catalog-content\") pod \"redhat-operators-89jlb\" (UID: \"cea95c66-6f88-4fa4-970a-3f43e6dcec16\") " pod="openshift-marketplace/redhat-operators-89jlb" Sep 30 18:07:16 crc kubenswrapper[4778]: I0930 18:07:16.114041 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea95c66-6f88-4fa4-970a-3f43e6dcec16-catalog-content\") pod \"redhat-operators-89jlb\" (UID: \"cea95c66-6f88-4fa4-970a-3f43e6dcec16\") " pod="openshift-marketplace/redhat-operators-89jlb" Sep 30 18:07:16 crc kubenswrapper[4778]: I0930 18:07:16.114259 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea95c66-6f88-4fa4-970a-3f43e6dcec16-utilities\") pod \"redhat-operators-89jlb\" (UID: \"cea95c66-6f88-4fa4-970a-3f43e6dcec16\") " pod="openshift-marketplace/redhat-operators-89jlb" Sep 30 18:07:16 crc kubenswrapper[4778]: I0930 18:07:16.145447 4778 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-79fwz\" (UniqueName: \"kubernetes.io/projected/cea95c66-6f88-4fa4-970a-3f43e6dcec16-kube-api-access-79fwz\") pod \"redhat-operators-89jlb\" (UID: \"cea95c66-6f88-4fa4-970a-3f43e6dcec16\") " pod="openshift-marketplace/redhat-operators-89jlb" Sep 30 18:07:16 crc kubenswrapper[4778]: I0930 18:07:16.220399 4778 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-89jlb" Sep 30 18:07:16 crc kubenswrapper[4778]: I0930 18:07:16.714177 4778 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-89jlb"] Sep 30 18:07:16 crc kubenswrapper[4778]: I0930 18:07:16.977260 4778 generic.go:334] "Generic (PLEG): container finished" podID="cea95c66-6f88-4fa4-970a-3f43e6dcec16" containerID="cf0535a652c18e96426f6727e05cd62dfa02a41615fcdef58830bf2eda9d9a92" exitCode=0 Sep 30 18:07:16 crc kubenswrapper[4778]: I0930 18:07:16.977454 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89jlb" event={"ID":"cea95c66-6f88-4fa4-970a-3f43e6dcec16","Type":"ContainerDied","Data":"cf0535a652c18e96426f6727e05cd62dfa02a41615fcdef58830bf2eda9d9a92"} Sep 30 18:07:16 crc kubenswrapper[4778]: I0930 18:07:16.978820 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89jlb" event={"ID":"cea95c66-6f88-4fa4-970a-3f43e6dcec16","Type":"ContainerStarted","Data":"d2edf5830f3fe7b66f507f207d0de5e4af327b7ccd7b3dae88d21897ae85a829"} Sep 30 18:07:17 crc kubenswrapper[4778]: I0930 18:07:17.989726 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89jlb" event={"ID":"cea95c66-6f88-4fa4-970a-3f43e6dcec16","Type":"ContainerStarted","Data":"95399348c08b8f21b57013b7a0fe982afbfaaff6e9c33e41dca56c63fbc4da0f"} Sep 30 18:07:19 crc kubenswrapper[4778]: I0930 18:07:19.001454 4778 generic.go:334] "Generic (PLEG): container finished" podID="cea95c66-6f88-4fa4-970a-3f43e6dcec16" containerID="95399348c08b8f21b57013b7a0fe982afbfaaff6e9c33e41dca56c63fbc4da0f" exitCode=0 Sep 30 18:07:19 crc kubenswrapper[4778]: I0930 18:07:19.001523 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89jlb" event={"ID":"cea95c66-6f88-4fa4-970a-3f43e6dcec16","Type":"ContainerDied","Data":"95399348c08b8f21b57013b7a0fe982afbfaaff6e9c33e41dca56c63fbc4da0f"} Sep 30 18:07:20 crc kubenswrapper[4778]: I0930 18:07:20.012238 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89jlb" event={"ID":"cea95c66-6f88-4fa4-970a-3f43e6dcec16","Type":"ContainerStarted","Data":"01c77fa7996514123618fc20f9abc1ba3ba5f6c1809c616627dd7110cc00cc69"} Sep 30 18:07:26 crc kubenswrapper[4778]: I0930 18:07:26.221036 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-89jlb" Sep 30 18:07:26 crc kubenswrapper[4778]: I0930 18:07:26.221479 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-89jlb" Sep 30 18:07:26 crc kubenswrapper[4778]: I0930 18:07:26.279060 4778 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-89jlb" Sep 30 18:07:26 crc kubenswrapper[4778]: I0930 18:07:26.303723 4778 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-89jlb" podStartSLOduration=8.618596518 podStartE2EDuration="11.303707801s" 
podCreationTimestamp="2025-09-30 18:07:15 +0000 UTC" firstStartedPulling="2025-09-30 18:07:16.979080521 +0000 UTC m=+2975.968978324" lastFinishedPulling="2025-09-30 18:07:19.664191804 +0000 UTC m=+2978.654089607" observedRunningTime="2025-09-30 18:07:20.035540657 +0000 UTC m=+2979.025438460" watchObservedRunningTime="2025-09-30 18:07:26.303707801 +0000 UTC m=+2985.293605614" Sep 30 18:07:27 crc kubenswrapper[4778]: I0930 18:07:27.149129 4778 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-89jlb" Sep 30 18:07:27 crc kubenswrapper[4778]: I0930 18:07:27.222261 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-89jlb"] Sep 30 18:07:29 crc kubenswrapper[4778]: I0930 18:07:29.108162 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-89jlb" podUID="cea95c66-6f88-4fa4-970a-3f43e6dcec16" containerName="registry-server" containerID="cri-o://01c77fa7996514123618fc20f9abc1ba3ba5f6c1809c616627dd7110cc00cc69" gracePeriod=2 Sep 30 18:07:30 crc kubenswrapper[4778]: I0930 18:07:30.126482 4778 generic.go:334] "Generic (PLEG): container finished" podID="cea95c66-6f88-4fa4-970a-3f43e6dcec16" containerID="01c77fa7996514123618fc20f9abc1ba3ba5f6c1809c616627dd7110cc00cc69" exitCode=0 Sep 30 18:07:30 crc kubenswrapper[4778]: I0930 18:07:30.126531 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89jlb" event={"ID":"cea95c66-6f88-4fa4-970a-3f43e6dcec16","Type":"ContainerDied","Data":"01c77fa7996514123618fc20f9abc1ba3ba5f6c1809c616627dd7110cc00cc69"} Sep 30 18:07:30 crc kubenswrapper[4778]: I0930 18:07:30.126560 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89jlb" event={"ID":"cea95c66-6f88-4fa4-970a-3f43e6dcec16","Type":"ContainerDied","Data":"d2edf5830f3fe7b66f507f207d0de5e4af327b7ccd7b3dae88d21897ae85a829"} Sep 30 18:07:30 crc kubenswrapper[4778]: I0930 18:07:30.126575 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2edf5830f3fe7b66f507f207d0de5e4af327b7ccd7b3dae88d21897ae85a829" Sep 30 18:07:30 crc kubenswrapper[4778]: I0930 18:07:30.151660 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-89jlb" Sep 30 18:07:30 crc kubenswrapper[4778]: I0930 18:07:30.259545 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79fwz\" (UniqueName: \"kubernetes.io/projected/cea95c66-6f88-4fa4-970a-3f43e6dcec16-kube-api-access-79fwz\") pod \"cea95c66-6f88-4fa4-970a-3f43e6dcec16\" (UID: \"cea95c66-6f88-4fa4-970a-3f43e6dcec16\") " Sep 30 18:07:30 crc kubenswrapper[4778]: I0930 18:07:30.259604 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea95c66-6f88-4fa4-970a-3f43e6dcec16-utilities\") pod \"cea95c66-6f88-4fa4-970a-3f43e6dcec16\" (UID: \"cea95c66-6f88-4fa4-970a-3f43e6dcec16\") " Sep 30 18:07:30 crc kubenswrapper[4778]: I0930 18:07:30.259796 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea95c66-6f88-4fa4-970a-3f43e6dcec16-catalog-content\") pod \"cea95c66-6f88-4fa4-970a-3f43e6dcec16\" (UID: \"cea95c66-6f88-4fa4-970a-3f43e6dcec16\") " Sep 30 18:07:30 crc kubenswrapper[4778]: I0930 18:07:30.260712 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea95c66-6f88-4fa4-970a-3f43e6dcec16-utilities" (OuterVolumeSpecName: "utilities") pod "cea95c66-6f88-4fa4-970a-3f43e6dcec16" (UID: "cea95c66-6f88-4fa4-970a-3f43e6dcec16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:07:30 crc kubenswrapper[4778]: I0930 18:07:30.276867 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cea95c66-6f88-4fa4-970a-3f43e6dcec16-kube-api-access-79fwz" (OuterVolumeSpecName: "kube-api-access-79fwz") pod "cea95c66-6f88-4fa4-970a-3f43e6dcec16" (UID: "cea95c66-6f88-4fa4-970a-3f43e6dcec16"). InnerVolumeSpecName "kube-api-access-79fwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:07:30 crc kubenswrapper[4778]: I0930 18:07:30.362178 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79fwz\" (UniqueName: \"kubernetes.io/projected/cea95c66-6f88-4fa4-970a-3f43e6dcec16-kube-api-access-79fwz\") on node \"crc\" DevicePath \"\"" Sep 30 18:07:30 crc kubenswrapper[4778]: I0930 18:07:30.362221 4778 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cea95c66-6f88-4fa4-970a-3f43e6dcec16-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 18:07:30 crc kubenswrapper[4778]: I0930 18:07:30.372177 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cea95c66-6f88-4fa4-970a-3f43e6dcec16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cea95c66-6f88-4fa4-970a-3f43e6dcec16" (UID: "cea95c66-6f88-4fa4-970a-3f43e6dcec16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:07:30 crc kubenswrapper[4778]: I0930 18:07:30.464198 4778 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cea95c66-6f88-4fa4-970a-3f43e6dcec16-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 18:07:31 crc kubenswrapper[4778]: I0930 18:07:31.133772 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-89jlb" Sep 30 18:07:31 crc kubenswrapper[4778]: I0930 18:07:31.175402 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-89jlb"] Sep 30 18:07:31 crc kubenswrapper[4778]: I0930 18:07:31.180458 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-89jlb"] Sep 30 18:07:31 crc kubenswrapper[4778]: I0930 18:07:31.727867 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cea95c66-6f88-4fa4-970a-3f43e6dcec16" path="/var/lib/kubelet/pods/cea95c66-6f88-4fa4-970a-3f43e6dcec16/volumes" Sep 30 18:08:07 crc kubenswrapper[4778]: I0930 18:08:07.505216 4778 generic.go:334] "Generic (PLEG): container finished" podID="84331001-fb2f-4d9f-a875-6f0ecdd92c5e" containerID="ffb4f73c5887984c9ea13d94cb2a53eff113729d3ca8df9f981911e51d0fb132" exitCode=0 Sep 30 18:08:07 crc kubenswrapper[4778]: I0930 18:08:07.505363 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xh5sb/must-gather-mmcld" event={"ID":"84331001-fb2f-4d9f-a875-6f0ecdd92c5e","Type":"ContainerDied","Data":"ffb4f73c5887984c9ea13d94cb2a53eff113729d3ca8df9f981911e51d0fb132"} Sep 30 18:08:07 crc kubenswrapper[4778]: I0930 18:08:07.506576 4778 scope.go:117] "RemoveContainer" containerID="ffb4f73c5887984c9ea13d94cb2a53eff113729d3ca8df9f981911e51d0fb132" Sep 30 18:08:07 crc kubenswrapper[4778]: I0930 18:08:07.898100 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xh5sb_must-gather-mmcld_84331001-fb2f-4d9f-a875-6f0ecdd92c5e/gather/0.log" Sep 30 18:08:15 crc kubenswrapper[4778]: I0930 18:08:15.200221 4778 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xh5sb/must-gather-mmcld"] Sep 30 18:08:15 crc kubenswrapper[4778]: I0930 18:08:15.201158 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-xh5sb/must-gather-mmcld" podUID="84331001-fb2f-4d9f-a875-6f0ecdd92c5e" containerName="copy" containerID="cri-o://8b32c1368b232e82aafe8384a1ac58a1ff498d3216f5d99d8a5e671fa9928ea9" gracePeriod=2 Sep 30 18:08:15 crc kubenswrapper[4778]: I0930 18:08:15.212793 4778 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xh5sb/must-gather-mmcld"] Sep 30 18:08:15 crc kubenswrapper[4778]: I0930 18:08:15.582547 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xh5sb_must-gather-mmcld_84331001-fb2f-4d9f-a875-6f0ecdd92c5e/copy/0.log" Sep 30 18:08:15 crc kubenswrapper[4778]: I0930 18:08:15.583768 4778 generic.go:334] "Generic (PLEG): container finished" podID="84331001-fb2f-4d9f-a875-6f0ecdd92c5e" containerID="8b32c1368b232e82aafe8384a1ac58a1ff498d3216f5d99d8a5e671fa9928ea9" exitCode=143 Sep 30 18:08:15 crc kubenswrapper[4778]: I0930 18:08:15.583827 4778 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b97ffe63219018468788c313eeb0f570e0c2f4f2ce48faf7472c552260b12177" Sep 30 18:08:15 crc kubenswrapper[4778]: I0930 18:08:15.609797 4778 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xh5sb_must-gather-mmcld_84331001-fb2f-4d9f-a875-6f0ecdd92c5e/copy/0.log" Sep 30 18:08:15 crc kubenswrapper[4778]: I0930 18:08:15.610442 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xh5sb/must-gather-mmcld" Sep 30 18:08:15 crc kubenswrapper[4778]: I0930 18:08:15.749287 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/84331001-fb2f-4d9f-a875-6f0ecdd92c5e-must-gather-output\") pod \"84331001-fb2f-4d9f-a875-6f0ecdd92c5e\" (UID: \"84331001-fb2f-4d9f-a875-6f0ecdd92c5e\") " Sep 30 18:08:15 crc kubenswrapper[4778]: I0930 18:08:15.749396 4778 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc7x7\" (UniqueName: \"kubernetes.io/projected/84331001-fb2f-4d9f-a875-6f0ecdd92c5e-kube-api-access-jc7x7\") pod \"84331001-fb2f-4d9f-a875-6f0ecdd92c5e\" (UID: \"84331001-fb2f-4d9f-a875-6f0ecdd92c5e\") " Sep 30 18:08:15 crc kubenswrapper[4778]: I0930 18:08:15.758917 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84331001-fb2f-4d9f-a875-6f0ecdd92c5e-kube-api-access-jc7x7" (OuterVolumeSpecName: "kube-api-access-jc7x7") pod "84331001-fb2f-4d9f-a875-6f0ecdd92c5e" (UID: "84331001-fb2f-4d9f-a875-6f0ecdd92c5e"). InnerVolumeSpecName "kube-api-access-jc7x7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 18:08:15 crc kubenswrapper[4778]: I0930 18:08:15.852598 4778 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc7x7\" (UniqueName: \"kubernetes.io/projected/84331001-fb2f-4d9f-a875-6f0ecdd92c5e-kube-api-access-jc7x7\") on node \"crc\" DevicePath \"\"" Sep 30 18:08:15 crc kubenswrapper[4778]: I0930 18:08:15.868474 4778 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84331001-fb2f-4d9f-a875-6f0ecdd92c5e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "84331001-fb2f-4d9f-a875-6f0ecdd92c5e" (UID: "84331001-fb2f-4d9f-a875-6f0ecdd92c5e"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 18:08:15 crc kubenswrapper[4778]: I0930 18:08:15.954267 4778 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/84331001-fb2f-4d9f-a875-6f0ecdd92c5e-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 18:08:16 crc kubenswrapper[4778]: I0930 18:08:16.590326 4778 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xh5sb/must-gather-mmcld" Sep 30 18:08:17 crc kubenswrapper[4778]: I0930 18:08:17.730049 4778 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84331001-fb2f-4d9f-a875-6f0ecdd92c5e" path="/var/lib/kubelet/pods/84331001-fb2f-4d9f-a875-6f0ecdd92c5e/volumes" Sep 30 18:08:44 crc kubenswrapper[4778]: I0930 18:08:44.819594 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:08:44 crc kubenswrapper[4778]: I0930 18:08:44.820267 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:09:14 crc kubenswrapper[4778]: I0930 18:09:14.811691 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:09:14 crc kubenswrapper[4778]: I0930 18:09:14.812220 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:09:44 crc kubenswrapper[4778]: I0930 18:09:44.812382 4778 patch_prober.go:28] interesting pod/machine-config-daemon-f5fmb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 18:09:44 crc kubenswrapper[4778]: I0930 18:09:44.813224 4778 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 18:09:44 crc kubenswrapper[4778]: I0930 18:09:44.813309 4778 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" Sep 30 18:09:44 crc kubenswrapper[4778]: I0930 18:09:44.814985 4778 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f835b78108e6c88fc507ec89d37fa501fbc67c0d1f1a43fb81eae881220b484a"} pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 18:09:44 crc kubenswrapper[4778]: I0930 18:09:44.815125 4778 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerName="machine-config-daemon" 
containerID="cri-o://f835b78108e6c88fc507ec89d37fa501fbc67c0d1f1a43fb81eae881220b484a" gracePeriod=600 Sep 30 18:09:44 crc kubenswrapper[4778]: E0930 18:09:44.953193 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 18:09:45 crc kubenswrapper[4778]: I0930 18:09:45.403268 4778 generic.go:334] "Generic (PLEG): container finished" podID="ac448347-b650-429e-9e31-f8f9b7565f6e" containerID="f835b78108e6c88fc507ec89d37fa501fbc67c0d1f1a43fb81eae881220b484a" exitCode=0 Sep 30 18:09:45 crc kubenswrapper[4778]: I0930 18:09:45.403319 4778 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" event={"ID":"ac448347-b650-429e-9e31-f8f9b7565f6e","Type":"ContainerDied","Data":"f835b78108e6c88fc507ec89d37fa501fbc67c0d1f1a43fb81eae881220b484a"} Sep 30 18:09:45 crc kubenswrapper[4778]: I0930 18:09:45.403354 4778 scope.go:117] "RemoveContainer" containerID="2a5286e9db2f7bc22ae7cbc733132d15348c9a7d4ac339ce379b1f1d56fb4677" Sep 30 18:09:45 crc kubenswrapper[4778]: I0930 18:09:45.404589 4778 scope.go:117] "RemoveContainer" containerID="f835b78108e6c88fc507ec89d37fa501fbc67c0d1f1a43fb81eae881220b484a" Sep 30 18:09:45 crc kubenswrapper[4778]: E0930 18:09:45.405239 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e" Sep 30 18:09:51 crc kubenswrapper[4778]: I0930 18:09:51.499688 4778 scope.go:117] "RemoveContainer" containerID="8b32c1368b232e82aafe8384a1ac58a1ff498d3216f5d99d8a5e671fa9928ea9" Sep 30 18:09:51 crc kubenswrapper[4778]: I0930 18:09:51.537860 4778 scope.go:117] "RemoveContainer" containerID="d6e002574b59f2cb8d1409054b379fdc4ea7c294765fb2f7deb3ff86f1c491a5" Sep 30 18:09:51 crc kubenswrapper[4778]: I0930 18:09:51.615690 4778 scope.go:117] "RemoveContainer" containerID="ffb4f73c5887984c9ea13d94cb2a53eff113729d3ca8df9f981911e51d0fb132" Sep 30 18:09:59 crc kubenswrapper[4778]: I0930 18:09:59.713840 4778 scope.go:117] "RemoveContainer" containerID="f835b78108e6c88fc507ec89d37fa501fbc67c0d1f1a43fb81eae881220b484a" Sep 30 18:09:59 crc kubenswrapper[4778]: E0930 18:09:59.714746 4778 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-f5fmb_openshift-machine-config-operator(ac448347-b650-429e-9e31-f8f9b7565f6e)\"" pod="openshift-machine-config-operator/machine-config-daemon-f5fmb" podUID="ac448347-b650-429e-9e31-f8f9b7565f6e"